1
00:00:00,099 --> 00:00:15,064
34c3 preroll music
2
00:00:15,064 --> 00:00:19,240
Herald Angel (H): OK, probably a couple
years ago you realize that a lot of the
3
00:00:19,240 --> 00:00:25,650
refugees coming up from Syria and North
Africa were communicating, were
4
00:00:25,650 --> 00:00:29,916
using technology in an interesting way to
find their way around a lot of the border
5
00:00:29,916 --> 00:00:36,999
patrols, a lot of the hurdles that were
put up in their way. In the US we have a
6
00:00:36,999 --> 00:00:41,780
similar issue but it's different in many
ways with illegal immigrants trying to
7
00:00:41,780 --> 00:00:47,909
stay underneath the radar. Allison
McDonald from the University of Michigan
8
00:00:47,909 --> 00:00:54,059
has been studying how immigrants in
the States deal with technology, and it's
9
00:00:54,059 --> 00:00:59,739
very different from here. Her interests
are in technology, privacy, society and
10
00:00:59,739 --> 00:01:03,839
human rights and I think we're gonna have
an awesome talk from her. So, well, please
11
00:01:03,839 --> 00:01:16,335
welcome her and we'll get moving.
Applause
12
00:01:16,335 --> 00:01:19,909
Allison McDonald: OK, thanks for coming.
I'm Allison from the University of
13
00:01:19,909 --> 00:01:26,049
Michigan. I'm talking today primarily
about technology in immigration
14
00:01:26,049 --> 00:01:30,469
enforcement and specifically about how the
immigrant community in the United States
15
00:01:30,469 --> 00:01:36,170
is responding to those changes and
especially the undocumented community.
16
00:01:36,170 --> 00:01:40,159
Before we get too far into the details I
just wanted to tell a little bit of a
17
00:01:40,159 --> 00:01:45,479
story. This is Anna Maria. She is not a
real person; she is sort of a composite
18
00:01:45,479 --> 00:01:49,890
of many people that we spoke to but her
story is really representative of a lot of
19
00:01:49,890 --> 00:01:56,249
people that we know are living in
the United States today. She and her
20
00:01:56,249 --> 00:02:02,519
husband emigrated from Mexico about 12
years ago into the United States. She
21
00:02:02,519 --> 00:02:06,370
really wanted to have children, but
couldn't get the fertility support that
22
00:02:06,370 --> 00:02:10,560
she needed in Mexico so she came to the
United States. And now she and her husband
23
00:02:10,560 --> 00:02:17,530
have two children who are attending US
public schools. She and her husband are
24
00:02:17,530 --> 00:02:22,507
both working and saving up to buy a
house. They pay taxes; they attend church
25
00:02:22,507 --> 00:02:27,070
every Sunday. They're involved in a lot of
community events and are really integrated
26
00:02:27,070 --> 00:02:33,750
into the local community. One
difference between Anna Maria and a lot of
27
00:02:33,750 --> 00:02:39,930
other people is that she's in the United
States as an undocumented immigrant. What
28
00:02:39,930 --> 00:02:45,190
this means is that she either entered the
United States without legal authorization
29
00:02:45,190 --> 00:02:56,090
or she came on a visa and overstayed the
allotted time. That means that day to day
30
00:02:56,090 --> 00:03:03,290
she has to worry about being found and
deported back to Mexico, removed from her
31
00:03:03,290 --> 00:03:11,370
home and this puts her in quite a
precarious situation trying to live a
32
00:03:11,370 --> 00:03:17,070
normal life, a life similar to a lot of
other people in our communities. But with
33
00:03:17,070 --> 00:03:21,490
this constant concern that this life
could be taken away from her if she's
34
00:03:21,490 --> 00:03:28,160
detected. Other than this one point
she really lives this immigration story
35
00:03:28,160 --> 00:03:34,200
that the United States loves to tell. We
love to have this narrative of people
36
00:03:34,200 --> 00:03:37,830
being able to come to the United States
and build lives for themselves that they
37
00:03:37,830 --> 00:03:42,500
might not be able to build in their
origin countries. And that's exactly
38
00:03:42,500 --> 00:03:51,200
what she's done. But just as natural to
this immigration story is a history of a
39
00:03:51,200 --> 00:03:58,650
lot of discrimination, racism and
xenophobia. All the way back in the 1700s
40
00:03:58,650 --> 00:04:03,780
we've had legislation that prevents people
from becoming citizens based on their
41
00:04:03,780 --> 00:04:11,020
origin country. We've had, for example,
the Chinese Exclusion Act preventing
42
00:04:11,020 --> 00:04:16,798
laborers from China from coming to the
United States entirely. The Asiatic Barred
43
00:04:16,798 --> 00:04:21,048
Zone, a couple years later, just drew a box
on a map and said the people in this
44
00:04:21,048 --> 00:04:25,910
region can't immigrate to the United
States. We've also seen things like the
45
00:04:25,910 --> 00:04:32,530
Johnson-Reed Immigration Act in the 1900s,
where the US took census data from
46
00:04:32,530 --> 00:04:38,750
before a big wave of immigration putting a
quota system in place that essentially
47
00:04:38,750 --> 00:04:44,630
prevented people from eastern and southern
Europe from coming to the United States.
48
00:04:44,630 --> 00:04:51,770
This history of discrimination and racism
continues to today. Many of you, I'm sure
49
00:04:51,770 --> 00:04:57,480
have heard of what's happening now with
the so-called Muslim ban where a list of
50
00:04:57,480 --> 00:05:01,840
seven countries are now blacklisted.
Immigrants are unable to enter the
51
00:05:01,840 --> 00:05:11,590
country. And this is just another data
point to show the trend that our discourse
52
00:05:11,590 --> 00:05:20,789
and immigration policy in the United
States is often racialized. I want to talk
53
00:05:20,789 --> 00:05:23,220
a little bit about what immigration
enforcement actually looks like in the
54
00:05:23,220 --> 00:05:30,050
United States. The agency that manages
enforcement is called the US Immigration
55
00:05:30,050 --> 00:05:39,639
and Customs Enforcement, or ICE. They're in
charge of enforcement within the borders
56
00:05:39,639 --> 00:05:44,270
once people have already entered the
country, finding people without
57
00:05:44,270 --> 00:05:50,140
documentation or managing immigration
cases. Over the last couple of decades
58
00:05:50,140 --> 00:05:56,220
they've really been gaining in size and
power. This is anything from the removal
59
00:05:56,220 --> 00:06:03,330
of privacy restrictions on sharing data
between federal agencies to an increase in
60
00:06:03,330 --> 00:06:10,530
financial resources after 9/11. And this
is happening even today. President Trump
61
00:06:10,530 --> 00:06:16,360
back in January signed an executive order
that is looking to add another 5,000
62
00:06:16,360 --> 00:06:21,420
agents to their current 20,000 over the
next couple of years. So this is an agency
63
00:06:21,420 --> 00:06:26,710
that's continuing to be bolstered. And
another way that they're changing,
64
00:06:26,710 --> 00:06:32,870
recently, is the way that they're
integrating technology into their jobs.
65
00:06:32,870 --> 00:06:37,560
This photo in particular shows a
fingerprint scanner. The collection of
66
00:06:37,560 --> 00:06:42,865
biometric data is becoming really common
in immigration enforcement. And it's not
67
00:06:42,865 --> 00:06:48,310
just when someone's taken into an
immigration office but mobile fingerprint
68
00:06:48,310 --> 00:06:52,480
scanners are being taken into communities.
There are stories of people having their
69
00:06:52,480 --> 00:06:58,440
biometric data taken, even without arrest.
Being stopped in the street or being near
70
00:06:58,440 --> 00:07:03,310
someone who's being detained for a
particular reason. Everyone in the area or
71
00:07:03,310 --> 00:07:09,370
everyone in the household having their
biometric data taken. We've also seen the
72
00:07:09,370 --> 00:07:13,370
removal of some restrictions on how this
data can be shared between federal
73
00:07:13,370 --> 00:07:21,820
agencies. In particular President Trump
has reinstated the Secure Communities
74
00:07:21,820 --> 00:07:27,230
Program which allows local police officers
when they're booking people for local
75
00:07:27,230 --> 00:07:33,300
crimes or in local jails to take biometric
data and cross-check it against federal
76
00:07:33,300 --> 00:07:47,220
immigration databases and crime databases.
We're also seeing evidence that,... So
77
00:07:47,220 --> 00:07:50,530
DHS, the Department of Homeland
Security, is the umbrella organization over
78
00:07:50,530 --> 00:07:59,340
ICE. We have recently seen through a
Freedom of Information request that this
79
00:07:59,340 --> 00:08:05,430
organization has used cell-site simulators
or stingrays over 1,800 times in the last
80
00:08:05,430 --> 00:08:11,960
five years. We don't know all of the cases
where these have been used. And we really
81
00:08:11,960 --> 00:08:16,650
can't speculate; these cases are shrouded
in secrecy and we don't know when and how
82
00:08:16,650 --> 00:08:20,620
they're being used. But we do have one
case, it's actually close to my home in
83
00:08:20,620 --> 00:08:27,580
Detroit, Michigan, where ICE was able to
send a warrant to Facebook to get an
84
00:08:27,580 --> 00:08:32,948
undocumented man's phone number and then
use that phone number with a cell-site
85
00:08:32,948 --> 00:08:41,729
simulator to track him to his home, and
they ended up deporting him to El Salvador.
86
00:08:41,729 --> 00:08:46,939
We're also seeing this move to start
collecting social media data at the
87
00:08:46,939 --> 00:08:52,439
borders. This isn't just for people on
temporary visas but also naturalized
88
00:08:52,439 --> 00:09:00,379
citizens and people with permanent
residency cards. This might not be so
89
00:09:00,379 --> 00:09:03,790
relevant to people who are already in the
country because they're not crossing the
90
00:09:03,790 --> 00:09:09,249
border regularly, but this might be
impactful if they have friends and family
91
00:09:09,249 --> 00:09:14,660
crossing borders to visit them. And new
immigrants as well. This is a database
92
00:09:14,660 --> 00:09:21,130
that we don't really know what it's being
used for yet. But there are some hints in
93
00:09:21,130 --> 00:09:27,500
the way that, for example, ICE has been
soliciting contracts from big data
94
00:09:27,500 --> 00:09:32,610
companies to create algorithms to do this
extreme vetting to be able to find
95
00:09:32,610 --> 00:09:40,419
suspicious activity or suspicious people
from troves of social media data. In fact
96
00:09:40,419 --> 00:09:44,519
we have already seen some of these
contracts being awarded. There was a $3
97
00:09:44,519 --> 00:09:50,980
million contract recently given to a
company called Giant Oak who claims to
98
00:09:50,980 --> 00:09:57,550
take big data and find bad guys. Their
creepy slogan, "We see the people behind
99
00:09:57,550 --> 00:10:07,269
the data" 'trademark'. And this is just
another example of the way that technology
100
00:10:07,269 --> 00:10:13,629
is being used to... in ways that are sort
of unpredictable at this point but
101
00:10:13,629 --> 00:10:21,610
we have many examples where this
style of research can often be
102
00:10:21,610 --> 00:10:29,660
discriminatory. And it might be expected
that at this point in time technology is
103
00:10:29,660 --> 00:10:34,040
ending up integrated into law enforcement
in the way that it's being integrated into
104
00:10:34,040 --> 00:10:38,129
a lot of different parts of our lives. But
there's a reason that this moment in
105
00:10:38,129 --> 00:10:44,240
particular is so frightening. This
administration's making it abundantly
106
00:10:44,240 --> 00:10:50,139
clear what they think of immigration. Just
in less than a year so far we've seen the
107
00:10:50,139 --> 00:10:54,470
repeal of the Deferred Action for
Childhood Arrivals program, which you might
108
00:10:54,470 --> 00:11:00,839
also hear called the DREAM Act, or hear people
talking about Dreamers. This is a program
109
00:11:00,839 --> 00:11:05,220
that allowed people who entered the
country under the age of 16 to get work
110
00:11:05,220 --> 00:11:11,820
permits and driver's licenses and attend
university and have their immigration
111
00:11:11,820 --> 00:11:18,639
cases delayed so long as they're meeting
educational goals. We've seen the
112
00:11:18,639 --> 00:11:27,869
elimination of privacy protections from
sharing data between federal agencies. And
113
00:11:27,869 --> 00:11:31,790
in addition to the actual concrete policy
changes, we're hearing a lot of really
114
00:11:31,790 --> 00:11:37,079
nasty rhetoric around immigrants and
immigration. That's causing a lot of
115
00:11:37,079 --> 00:11:41,790
concern among people who are in the
immigrant community or who are allies to
116
00:11:41,790 --> 00:11:46,790
the immigrant community about what this
means in terms of harassment and hatred
117
00:11:46,790 --> 00:11:55,350
even beyond the legal changes. We're
also seeing a change in deportation
118
00:11:55,350 --> 00:12:04,990
practices. While Obama was prolific in
deportations, he had a very explicit
119
00:12:04,990 --> 00:12:10,429
policy in place that the priority
deportations would be people who were
120
00:12:10,429 --> 00:12:14,089
national security threats, whatever that
might mean, or people with serious
121
00:12:14,089 --> 00:12:19,535
criminal records, or people who had just
recently entered the United States. That
122
00:12:19,535 --> 00:12:23,209
policy is being removed and we're seeing
more and more people who are deported
123
00:12:23,209 --> 00:12:31,350
after living in the United States for a
long time with family and friends and
124
00:12:31,350 --> 00:12:36,199
lives built in the communities; who might
have family or children who are US
125
00:12:36,199 --> 00:12:47,039
citizens who don't have criminal records.
So what does this mean for Anna Maria? For
126
00:12:47,039 --> 00:12:51,269
one, she's without a criminal record. She
previously might have been able to have
127
00:12:51,269 --> 00:12:54,739
a high degree of confidence that she
wouldn't be a priority target, and that
128
00:12:54,739 --> 00:13:02,399
confidence is being eroded. We're
seeing lots of people who previously
129
00:13:02,399 --> 00:13:10,709
wouldn't have been targeted be deported
regardless of their clean record and lack
130
00:13:10,709 --> 00:13:17,420
of any action that really makes them more
visible than they have been in the past.
131
00:13:17,420 --> 00:13:21,259
She and her husband are starting to think
about, what happens to their children if
132
00:13:21,259 --> 00:13:25,119
they're deported. They have to make the
decision because the children were born in
133
00:13:25,119 --> 00:13:29,949
the United States, they're US citizens.
They have to decide whether they should
134
00:13:29,949 --> 00:13:33,939
give custody to friends and family who can
stay in the United States, or if they
135
00:13:33,939 --> 00:13:38,430
should take them back to Mexico, rather
than letting them stay and get the US
136
00:13:38,430 --> 00:13:44,000
education that they want to have. She has
to be concerned about ICE being in her
137
00:13:44,000 --> 00:13:48,589
community and outside of her home.
Possibly having her fingerprints taken if
138
00:13:48,589 --> 00:13:53,029
she's in the wrong place at the wrong
time. She might have to worry about
139
00:13:53,029 --> 00:13:57,730
friends and family from Mexico visiting,
and crossing the border, and having social
140
00:13:57,730 --> 00:14:04,249
media data taken from them. That, I mean,
as we all know, might indicate a lot more
141
00:14:04,249 --> 00:14:10,649
than just about the person who's crossing
the border. Our social media lives give a
142
00:14:10,649 --> 00:14:17,000
lot of information about her networks that
might expose information about her. It's
143
00:14:17,000 --> 00:14:20,459
also worth noting that Anna Maria is far
from alone. There are as many as 11
144
00:14:20,459 --> 00:14:25,049
million undocumented immigrants in the
United States today. Over 2/3 of them have
145
00:14:25,049 --> 00:14:29,000
been in the United States for more than 10
years which means they're integrated into
146
00:14:29,000 --> 00:14:35,059
our communities, they own houses, they
have jobs, they pay taxes, they live
147
00:14:35,059 --> 00:14:40,410
really normal lives to the extent that
they can in the United States. They've
148
00:14:40,410 --> 00:14:48,629
built their lives here. So with this
context in mind, I and some of my
149
00:14:48,629 --> 00:14:53,809
collaborators were wondering, how this is
really changing the way that people use
150
00:14:53,809 --> 00:15:00,629
technology? Or if it is, given the sort of
objectively heightened risk that they're
151
00:15:00,629 --> 00:15:04,899
facing day to day. We wanted to know
whether or not there's any sort of
152
00:15:04,899 --> 00:15:14,549
reaction to those changes happening in
their daily lives. We reached out to some
153
00:15:14,549 --> 00:15:18,879
immigration support organizations, so
immigrant rights and activist
154
00:15:18,879 --> 00:15:25,820
organizations and worked with them to be
able to communicate with this community.
155
00:15:25,820 --> 00:15:31,179
In the end, we were able to talk to 17
undocumented immigrants in the Midwest. We
156
00:15:31,179 --> 00:15:37,129
were primarily asking them about how they
manage risk in their daily lives offline,
157
00:15:37,129 --> 00:15:42,529
as well as online. And whether or not
that's changing over the last year or two
158
00:15:42,529 --> 00:15:46,499
years, when this discourse around
immigration is really changing, and then
159
00:15:46,499 --> 00:15:53,980
whether these changes that we're seeing,
are causing them to maybe react in the way
160
00:15:53,980 --> 00:16:00,570
that they're using technology. I can tell
you a little bit about who we spoke to.
161
00:16:00,570 --> 00:16:07,039
The majority were women, 14 of our 17
participants were women. Most of them were
162
00:16:07,039 --> 00:16:12,809
in their mid 30s, average age 35. And lots
of them had children. So it was a lot of
163
00:16:12,809 --> 00:16:17,470
parents. Everyone that we spoke to, had
been in the United States for more than 10
164
00:16:17,470 --> 00:16:23,189
years. So they really had their lives and
their communities here. And most of them
165
00:16:23,189 --> 00:16:26,999
were also from Mexico. That's about
consistent with the immigrant community in
166
00:16:26,999 --> 00:16:33,684
the United States, especially from Latin
America. The majority are from Mexico. And
167
00:16:33,684 --> 00:16:37,239
then there was a mix of immigration
stories. Some of the people we spoke to
168
00:16:37,239 --> 00:16:43,209
had crossed the southern border by foot or
otherwise. And some people had overstayed
169
00:16:43,209 --> 00:16:53,120
visas, had flown to the United States and
stayed. So we wanted to first get an idea
170
00:16:53,120 --> 00:16:57,290
of how they're managing and sort of
thinking about risk in their daily lives
171
00:16:57,290 --> 00:17:05,409
offline to get a sense of how deeply it
impacts the way that they're living. What
172
00:17:05,409 --> 00:17:10,410
we found across the board is that
immigration is a really sort of looming
173
00:17:10,410 --> 00:17:15,609
presence in their lives. They think a lot
about how they're exposing themselves, and
174
00:17:15,609 --> 00:17:22,199
possibly exposing their status, to
authority figures. And they put like a lot
175
00:17:22,199 --> 00:17:29,980
of careful consideration into how to keep
a low profile. Driving is one really good
176
00:17:29,980 --> 00:17:37,950
example of this cost-benefit
analysis that they're doing. Most people
177
00:17:37,950 --> 00:17:40,809
we spoke to talked about driving one
way or another, and about half chose to
178
00:17:40,809 --> 00:17:47,210
drive and half chose not to. Most of the
people don't have driver's licenses for
179
00:17:47,210 --> 00:17:51,510
the United States because it's difficult
to get them without legal immigration
180
00:17:51,510 --> 00:17:57,940
papers. So the risk with driving is that
if you're stopped, if you're pulled over,
181
00:17:57,940 --> 00:18:02,100
even if you didn't have a traffic
violation, if you're stopped for a taillight or
182
00:18:02,100 --> 00:18:06,428
something. The routine is to ask for
documentation, for your license. And if you
183
00:18:06,428 --> 00:18:09,149
don't have that there might be more
questions, and in the end, you could
184
00:18:09,149 --> 00:18:17,179
expose yourself to immigration or other
law enforcement. Some people really
185
00:18:17,179 --> 00:18:23,300
thought that the risk was worth it. To
live their lives how they want to. They're
186
00:18:23,300 --> 00:18:26,679
going to try to just not think about the
risk and do what they need to do day to
187
00:18:26,679 --> 00:18:33,269
day. Other people felt that the risk was
too great and chose not to drive at all.
188
00:18:33,269 --> 00:18:36,600
And that's a significant sacrifice,
especially in the United States where our
189
00:18:36,600 --> 00:18:40,950
public transportation systems aren't
fantastic. This might mean that they can't
190
00:18:40,950 --> 00:18:44,121
set their own work schedules, or they
can't take their kids to school if they
191
00:18:44,121 --> 00:18:49,750
miss the bus. So it's a significant risk.
But it's also a big sacrifice if they
192
00:18:49,750 --> 00:18:56,769
choose not to drive. People also think a
lot about how they're exposing themselves
193
00:18:56,769 --> 00:19:03,835
to authority figures. As one example, the
decision to file taxes or not is a big
194
00:19:03,835 --> 00:19:09,980
risk. So in the United States, you don't
need to have any sort of government ID to
195
00:19:09,980 --> 00:19:17,309
file taxes, you just need a tax ID. So a
lot of these people are filing taxes. But
196
00:19:17,309 --> 00:19:21,769
in order to do that, they are giving up to
the federal government their names, their
197
00:19:21,769 --> 00:19:27,410
addresses, their employment history,
contact information. And some people think
198
00:19:27,410 --> 00:19:32,880
that that risk is worth it, right. Because
this person for example feels like, by
199
00:19:32,880 --> 00:19:40,980
paying taxes every year they're able to
establish a good history of upstanding
200
00:19:40,980 --> 00:19:45,389
behavior. They can maybe have a better
case for getting a legal status if the
201
00:19:45,389 --> 00:19:56,950
time comes, when that's an option. And
another example of, you know, exposing
202
00:19:56,950 --> 00:20:02,409
information to authorities, might be
filing for benefits for US born children,
203
00:20:02,409 --> 00:20:09,019
or even library cards, or local ID cards.
And the risk is going to be different in
204
00:20:09,019 --> 00:20:13,840
each case depending on what they're
exposing. Some people chose to forego
205
00:20:13,840 --> 00:20:20,519
significant benefits to avoid giving that
information to authorities. This person is
206
00:20:20,519 --> 00:20:25,997
talking about DACA, the Deferred Action
for Childhood Arrivals program. This would
207
00:20:25,997 --> 00:20:30,661
make it much easier for their son to go to
college, give their son hopefully if they
208
00:20:30,661 --> 00:20:36,282
trust the program, a much more reliable
immigration status. They wouldn't
209
00:20:36,282 --> 00:20:41,210
technically have a legal immigration
status but they would be sort of assured
210
00:20:41,210 --> 00:20:45,882
that their status, or rather their
immigration case is a low priority. They
211
00:20:45,882 --> 00:20:49,600
wouldn't be targeted. And as long as
they're attending universities, they could
212
00:20:49,600 --> 00:20:56,180
have confidence, or so the program says,
that they wouldn't be targeted. These people
213
00:20:56,180 --> 00:21:00,669
were concerned because in order to file
that paperwork for their son, they had to
214
00:21:00,669 --> 00:21:03,564
give up a lot of information about
themselves: their phone numbers, their
215
00:21:03,564 --> 00:21:09,780
names, their addresses. And in the end,
they decided not to do it. And
216
00:21:09,780 --> 00:21:14,333
unfortunately, only weeks after we spoke
to this person, the DACA program was
217
00:21:14,333 --> 00:21:19,240
repealed. This has led a lot of people to
be concerned because the people who did
218
00:21:19,240 --> 00:21:23,130
apply for the program, have given that
information to the government, to the
219
00:21:23,130 --> 00:21:27,889
Immigration services in particular. And at
this point in time, we have no assurances
220
00:21:27,889 --> 00:21:32,930
that that information won't be used in
immigration cases. At the moment, there's
221
00:21:32,930 --> 00:21:38,690
just a sort of FAQ page that says, we
don't use this information now but we
222
00:21:38,690 --> 00:21:47,390
reserve the right to change that at any
time without telling anyone. People are
223
00:21:47,390 --> 00:21:51,639
also really feeling the changes that are
happening in the last couple of months.
224
00:21:51,639 --> 00:21:57,539
Well, it's been more than a couple of months,
the last year and a half. They're feeling the
225
00:21:57,539 --> 00:22:02,691
pressure in their communities from
immigration services, or immigration
226
00:22:02,691 --> 00:22:09,679
enforcement, being more present and less
predictable. One person described
227
00:22:09,679 --> 00:22:13,399
feeling like, instead of coming to take a
particular person, they're just coming and
228
00:22:13,399 --> 00:22:19,020
looking for anyone who might be
undocumented. Many people that we spoke
229
00:22:19,020 --> 00:22:24,269
to, had negative experiences with ICE.
Including... even if they
230
00:22:24,269 --> 00:22:27,559
hadn't had the experience themselves, lots
of people had friends and family who had
231
00:22:27,559 --> 00:22:32,669
negative experiences. And they're feeling
this increase in presence of enforcement
232
00:22:32,669 --> 00:22:38,070
in their communities. And this is leading
them to make significant changes to the
233
00:22:38,070 --> 00:22:43,080
way that they're living their lives. For
example, one person we spoke to talked
234
00:22:43,080 --> 00:22:47,080
about how they won't leave their child at
home alone anymore because they're worried
235
00:22:47,080 --> 00:22:51,559
that, if they're picked up while they're
out, and
236
00:22:51,559 --> 00:22:56,220
the child's at home alone, the child might be
left there. Or ICE might even show up at
237
00:22:56,220 --> 00:22:58,963
the house while the child's there alone.
They don't want either of those things to
238
00:22:58,963 --> 00:23:07,030
happen. So people are changing a lot of
the ways that they live day to day. And
239
00:23:07,030 --> 00:23:12,690
this is a very present concern, in the way
that they talk about their daily lives. So
240
00:23:12,690 --> 00:23:15,590
we were wondering if this is true when
they think about the way that they use
241
00:23:15,590 --> 00:23:22,480
technology and what they're doing online.
First, let me just give you an overview of
242
00:23:22,480 --> 00:23:27,270
what sort of technologies they primarily
use. This community is really mobile
243
00:23:27,270 --> 00:23:31,889
heavy. Some people had computers in the
home. A lot of people had access to
244
00:23:31,889 --> 00:23:35,380
computers through local libraries and
things. But everyone had a smartphone and
245
00:23:35,380 --> 00:23:40,460
they were very dependent on it. Some
people used email but when they spoke
246
00:23:40,460 --> 00:23:47,590
about email, it was mostly to do with
communicating with their kids' schools or
247
00:23:47,590 --> 00:23:51,350
doctor's appointments. It wasn't really a
social thing. So the majority of what we
248
00:23:51,350 --> 00:23:57,080
spoke to people about was social media
tools. In particular, all but one of our
249
00:23:57,080 --> 00:24:03,429
participants were active users of
Facebook. Most people were using WhatsApp
250
00:24:03,429 --> 00:24:09,520
and Facebook Messenger, as well. These are
the three primary tools that people had
251
00:24:09,520 --> 00:24:16,120
the most to say about. There were some
other tools that they were on: Instagram,
252
00:24:16,120 --> 00:24:23,070
Twitter, and Snapchat. But really, the
overarching sort of sense that people
253
00:24:23,070 --> 00:24:26,289
had about these tools is that they're
bringing significant benefits to their
254
00:24:26,289 --> 00:24:31,200
daily lives. Especially, when you think
about this community being separated
255
00:24:31,200 --> 00:24:36,610
permanently from a lot of their friends
and family back home, or their former
256
00:24:36,610 --> 00:24:42,500
home, their origin country. What they had
to do before, maybe sending photos in the
257
00:24:42,500 --> 00:24:46,610
mail or through post cards, buying
international calling cards, being able to
258
00:24:46,610 --> 00:24:50,779
call people with video chat now is a
significant improvement to their ability
259
00:24:50,779 --> 00:24:56,450
to keep in touch with people back in
Mexico or wherever their origin
260
00:24:56,450 --> 00:25:02,990
country is. People also talked about, how
it's improving their lives in other ways.
261
00:25:02,990 --> 00:25:06,750
For example, being able to organize their
own work schedules, and have more control
262
00:25:06,750 --> 00:25:12,179
over the way that they're employed. The
benefits go on and on, and it's a lot of
263
00:25:12,179 --> 00:25:15,429
the same things that we've experienced
over the last decade, and the way that our
264
00:25:15,429 --> 00:25:21,440
lives have changed for the better. Because
we're able to use these technologies. When
265
00:25:21,440 --> 00:25:26,162
we asked people about risk, the things that
really pop into their heads first, are
266
00:25:26,162 --> 00:25:32,581
hackers. They're really concerned about
fraud and identity theft. And they think a
267
00:25:32,581 --> 00:25:37,700
lot about their children contacting
strangers on the internet, or accessing
268
00:25:37,700 --> 00:25:47,679
inappropriate content. But that's not to
say that concerns related to their status,
269
00:25:47,679 --> 00:25:57,620
their illegal status were absent. They're
just much less certain. You know, it's
270
00:25:57,620 --> 00:26:03,539
easy to think about the consequences of
identity theft. That's sort of concrete.
271
00:26:03,539 --> 00:26:12,379
But a lot of these status related concerns
were less concrete. People talked about
272
00:26:12,379 --> 00:26:17,399
harassment as well, being something that's
increasing in the real world, as well as
273
00:26:17,399 --> 00:26:28,049
online. In particular participating in
communities, or in conversations online
274
00:26:28,049 --> 00:26:33,639
that might expose their immigration
status. This harassment has moved online.
275
00:26:33,639 --> 00:26:38,620
They're experiencing it in the real world,
as well, but they're hearing stories or
276
00:26:38,620 --> 00:26:42,039
having stories themselves about people
threatening them with immigration
277
00:26:42,039 --> 00:26:54,049
enforcement. That's increasing over the
last year or so. There are a couple of
278
00:26:54,049 --> 00:27:00,860
ways that people manage these risks.
Primarily, what we found is that people's
279
00:27:00,860 --> 00:27:07,029
concrete steps to
managing their privacy online were fairly
280
00:27:07,029 --> 00:27:11,671
basic things like, making sure that they
only accept friends and family on
281
00:27:11,671 --> 00:27:19,580
Facebook. They might have set their
profile to private. But they're really not
282
00:27:19,580 --> 00:27:24,159
fiddling with these more fine-grained
privacy settings. They're not, you know,
283
00:27:24,159 --> 00:27:27,942
sharing particular posts only to
particular people, or anything like that. They
284
00:27:27,942 --> 00:27:30,789
didn't tell us about using things like
private groups or
285
00:27:30,789 --> 00:27:40,419
anything like that to sort of create
separate spheres of friends and family.
286
00:27:40,419 --> 00:27:45,179
And channel management, just in the sense
that like, even though they think about
287
00:27:45,179 --> 00:27:49,309
curating this, like close network of
friends and family, they're still really
288
00:27:49,309 --> 00:27:55,280
thoughtful about what they post in which
channel. Whether like it's safe to put a
289
00:27:55,280 --> 00:27:59,159
photo, for example on their wall, or you
know, in their timeline versus sending it
290
00:27:59,159 --> 00:28:07,210
directly to family. This person, for
example, even after they post something
291
00:28:07,210 --> 00:28:13,710
publicly, publicly being, you know, within
their Facebook wall, they'll still go back
292
00:28:13,710 --> 00:28:16,789
to it a couple of days later and just delete
everything because they're not totally
293
00:28:16,789 --> 00:28:27,820
confident that that's private. Another
really interesting thing is that in all of
294
00:28:27,820 --> 00:28:32,150
the conversations we had, no one
really expressed the sense that they
295
00:28:32,150 --> 00:28:39,259
understood that they're really living on
Facebook. The tools that they're using
296
00:28:39,259 --> 00:28:46,679
like almost exclusively, are all owned by
the same company. No one expressed any
297
00:28:46,679 --> 00:28:51,429
sort of sense that these companies are
entities in themselves that might have
298
00:28:51,429 --> 00:28:56,950
interest in access to their data. Much
less one that cooperates with law
299
00:28:56,950 --> 00:29:05,190
enforcement. That concern didn't appear in
any of our conversations. They tend to
300
00:29:05,190 --> 00:29:09,789
think about these platforms as being sort
of a medium to communicate with other
301
00:29:09,789 --> 00:29:17,340
people. You know, the way that they use
it, is to talk to other individuals, or
302
00:29:17,340 --> 00:29:21,300
groups of individuals. But the platform
doesn't seem to be like a repository for
303
00:29:21,300 --> 00:29:28,409
data. In fact, they are expressing
significant trust in these platforms, Facebook in
304
00:29:28,409 --> 00:29:32,510
particular. A lot of people were grateful
for the changes that Facebook's made over
305
00:29:32,510 --> 00:29:40,289
the last year or two, in terms of account
management. So they're grateful that if
306
00:29:40,289 --> 00:29:44,169
there's a suspicious login attempt,
they'll be able to stop it. That's helped
307
00:29:44,169 --> 00:29:48,659
a lot of people. And that sort of
generates trust in these platforms. And
308
00:29:48,659 --> 00:30:01,370
the sense that Facebook really has their
back. In addition to sort of managing the
309
00:30:01,370 --> 00:30:06,129
way that they're sharing information, we
did see some people choosing to abstain
310
00:30:06,129 --> 00:30:12,049
from sharing. Especially, when it came to
topics around immigration. Some people
311
00:30:12,049 --> 00:30:18,320
chose to not join, you know, public
Facebook groups, or get information from
312
00:30:18,320 --> 00:30:22,040
certain places because they were afraid
that by associating with these groups,
313
00:30:22,040 --> 00:30:32,529
they might indicate something publicly
about their status. And that's frustrating
314
00:30:32,529 --> 00:30:35,080
for a lot of people who want to
participate in these conversations, and
315
00:30:35,080 --> 00:30:38,830
especially, because the discourse around
immigration is so toxic in the United
316
00:30:38,830 --> 00:30:45,750
States. Some people expressed this feeling
that they have to just sit there and take
317
00:30:45,750 --> 00:30:51,010
this discourse happening around them
without participating, because they're
318
00:30:51,010 --> 00:30:57,019
worried about being targeted, or harassed,
or maybe even like having physical
319
00:30:57,019 --> 00:31:00,529
consequences: being followed, or having
immigration sent to their house if someone
320
00:31:00,529 --> 00:31:09,129
were to find them. Some people expressed
the opposite, though, which is
321
00:31:09,129 --> 00:31:16,240
encouraging, right? Some people felt that,
even though the risk is there, it's more
322
00:31:16,240 --> 00:31:20,399
important for them to share their thoughts
than it is for them to be tiptoeing around
323
00:31:20,399 --> 00:31:27,929
immigration enforcement. This is also
really interesting because this sort of
324
00:31:27,929 --> 00:31:34,289
sometimes exposes family tensions about
these topics. This is a really... it's a
325
00:31:34,289 --> 00:31:37,720
mixed-status community, meaning that
sometimes parents will be undocumented and
326
00:31:37,720 --> 00:31:42,690
children will be US citizens. Or lots of
people have friends and family who have a
327
00:31:42,690 --> 00:31:47,652
different legal status than they do. So
risk is really distributed. You know, it's
328
00:31:47,652 --> 00:31:52,410
not just individual, it's within families
and within communities. And there can be a
329
00:31:52,410 --> 00:31:57,049
lot of tension between, you know, children
and parents, or friends, you know,
330
00:31:57,049 --> 00:32:01,210
siblings, about how they share information
on these platforms. Some people are much
331
00:32:01,210 --> 00:32:09,889
more conservative with what they share.
And this quote also reveals something else
332
00:32:09,889 --> 00:32:17,340
kind of interesting. When we talked to
people about concerns about immigration,
333
00:32:17,340 --> 00:32:21,679
it's very rarely that they talk about
whether immigration will be able to
334
00:32:21,679 --> 00:32:27,980
investigate them, as much as it is about
when, which is this final point that
335
00:32:27,980 --> 00:32:33,809
there's really this sense of resignation
in the community about what information
336
00:32:33,809 --> 00:32:43,919
immigration enforcement has about them.
Lots of people feel like, it doesn't
337
00:32:43,919 --> 00:32:50,600
really matter what they do. Immigration
can know where they are and what they're
338
00:32:50,600 --> 00:32:55,230
doing. They can find them if they just
decide to. It's just a matter of whether
339
00:32:55,230 --> 00:32:59,269
immigration enforcement is going to choose
to come after them, rather than whether
340
00:32:59,269 --> 00:33:08,670
they can. This is also true with the way
that they think about technology. They
341
00:33:08,670 --> 00:33:15,639
have a sense that there's really no
privacy. If immigration decided to, they
342
00:33:15,639 --> 00:33:20,389
would be able to see the messages on
Facebook, they could see what was
343
00:33:20,389 --> 00:33:25,559
physically on their phones, that they have
this sort of all-powerful, you know,
344
00:33:25,559 --> 00:33:31,539
toolkit to access their digital
information. And honestly, this story in
345
00:33:31,539 --> 00:33:39,178
particular, this sense of surveillance
comes from experience often. This person
346
00:33:39,178 --> 00:33:44,190
had a really negative experience with ICE,
you know, coming and talking to her
347
00:33:44,190 --> 00:33:49,309
family. And ICE knowing things that they
hadn't told anyone. Somehow ICE had known
348
00:33:49,309 --> 00:33:53,110
things that they were keeping very
private. And so there's this assumption
349
00:33:53,110 --> 00:33:56,570
that, well, it's happened to me before,
I've seen it happen to my friends, they
350
00:33:56,570 --> 00:34:08,139
probably could know anything they want to.
But it's not all negative, it's not all
351
00:34:08,139 --> 00:34:14,000
resignation. Another thing that we saw,
many people, not everyone, but maybe half
352
00:34:14,000 --> 00:34:16,389
of the people we spoke to, had this really
strong sense that there was this
353
00:34:16,389 --> 00:34:21,230
responsibility to share things in the
community to help each other. There's this
354
00:34:21,230 --> 00:34:29,290
growing sense of community identity. And
this might mean sharing information about
355
00:34:29,290 --> 00:34:34,650
resources for the immigrant community or
sharing information about workshops, or
356
00:34:34,650 --> 00:34:41,440
events, vigils, but also information about
immigration enforcement. If ICE is in a
357
00:34:41,440 --> 00:34:45,800
particular community, they might tell
their friends and family, avoid this area
358
00:34:45,800 --> 00:34:50,440
until further notice. They're helping each
other, they're sending information. So, it
359
00:34:50,440 --> 00:34:54,210
can't be total resignation. There's still
this sort of beam of hope that they're
360
00:34:54,210 --> 00:34:57,600
helping each other. And they must have
hope that they can do something because
361
00:34:57,600 --> 00:35:03,260
they are. And this has been something that
has become faster and easier with
362
00:35:03,260 --> 00:35:08,680
technology, too, right? It's much easier
to send a message than it is to call, or
363
00:35:08,680 --> 00:35:17,270
to spread information before we had, you
know, smartphones. But all of this really
364
00:35:17,270 --> 00:35:20,650
leads to the question: Considering how
much they inconvenience themselves in
365
00:35:20,650 --> 00:35:25,200
their daily lives offline, why are they
doing comparatively little online to
366
00:35:25,200 --> 00:35:32,700
change their practices, or to reduce their
visibility? I don't think it's enough
367
00:35:32,700 --> 00:35:38,510
that, although lots of people expressed
this sense that they're relatively low
368
00:35:38,510 --> 00:35:46,370
in tech literacy, that in and of itself
isn't really enough of an explanation,
369
00:35:46,370 --> 00:35:50,490
right? There are so many different factors
that go into the way that they're making these
370
00:35:50,490 --> 00:35:55,600
decisions, and they're thinking carefully
about the decisions they do make. So we
371
00:35:55,600 --> 00:36:00,660
have some thoughts on this. It really
can't be overstated how much of a benefit
372
00:36:00,660 --> 00:36:05,540
technology is to this community. It's
making a significant difference in the way
373
00:36:05,540 --> 00:36:14,869
that they live their lives. So the choice
to abstain is not trivial. The risk that
374
00:36:14,869 --> 00:36:19,080
they're facing by using like Facebook, by
putting phone numbers on Facebook, or
375
00:36:19,080 --> 00:36:23,760
sharing photos of their family and
friends, and like, building these online
376
00:36:23,760 --> 00:36:29,650
networks... really, the risk involved in
that is uncertain, right? At this point we
377
00:36:29,650 --> 00:36:34,580
have really sparse data about direct
connections between the use of technology,
378
00:36:34,580 --> 00:36:39,670
or the use of social media, and immigration
enforcement consequences. Maybe that
379
00:36:39,670 --> 00:36:43,640
will change, but at this point it's
unclear which changes might be actually
380
00:36:43,640 --> 00:36:48,860
beneficial, right? Because there's not a
direct connection between using this tool,
381
00:36:48,860 --> 00:36:55,500
putting this information online, and
immigration enforcement showing up.
382
00:36:55,500 --> 00:37:00,770
There's also the significant trust in the
platforms that they're using and their
383
00:37:00,770 --> 00:37:08,080
peers are using as well and there just
tends to be less critical thought about
384
00:37:08,080 --> 00:37:13,620
the safety of using platforms when there's
already this component of trust. Facebook
385
00:37:13,620 --> 00:37:18,830
has done a lot for account security for
example over the last couple of years and
386
00:37:18,830 --> 00:37:25,470
has built trust in this community. As
well, having, you know, all of your
387
00:37:25,470 --> 00:37:30,470
community on a tool, when they're all there
together, there's less
388
00:37:30,470 --> 00:37:36,930
critical thought about whether
it's safe to be there. And there is this
389
00:37:36,930 --> 00:37:41,730
component of resignation. When we've sort
of pushed people to think really
390
00:37:41,730 --> 00:37:46,900
explicitly about the risk with immigration
enforcement being in sharing information
391
00:37:46,900 --> 00:37:53,840
on social media and using technology, there was
the sense that if they wanted to, they
392
00:37:53,840 --> 00:37:57,130
could have the information. I mean, they
already have it in a lot of ways when
393
00:37:57,130 --> 00:38:04,160
they're filing taxes. Or it's just, you know,
accessible to authorities, is the general
394
00:38:04,160 --> 00:38:08,970
sense, regardless of what they do
online. So this, kind of in combination
395
00:38:08,970 --> 00:38:13,850
with the uncertain risk, makes it really
hard to take concrete steps towards
396
00:38:13,850 --> 00:38:25,060
changes that might be helpful. So finally,
I just wanted to share a couple of things
397
00:38:25,060 --> 00:38:34,390
that I learned, especially as a digital
security trainer, in doing this study.
398
00:38:34,390 --> 00:38:41,970
Most importantly everyone that we spoke to
was really excited to learn. That's just
399
00:38:41,970 --> 00:38:47,100
general like tech literacy but also
security and privacy. People really care
400
00:38:47,100 --> 00:38:52,310
and they're excited. And everyone
expressed gratitude that we were talking
401
00:38:52,310 --> 00:39:00,340
to them about this topic. They care a lot.
But what was difficult for me, having a
402
00:39:00,340 --> 00:39:06,570
background in trainings, was still being
surprised by things in these
403
00:39:06,570 --> 00:39:12,670
conversations: thinking I knew what
they wanted or what they needed, and that
404
00:39:12,670 --> 00:39:17,400
not being the case. So one thing I would
say is you know don't assume that you know
405
00:39:17,400 --> 00:39:22,770
what's best for them or even what they
want or need. Go and talk to people.
406
00:39:22,770 --> 00:39:27,020
You'll really learn a lot from
talking to people about what they think
407
00:39:27,020 --> 00:39:32,870
their risk is versus what they're doing.
For example something that I was surprised
408
00:39:32,870 --> 00:39:37,180
to learn is that they're really not using
online resources when they have concerns
409
00:39:37,180 --> 00:39:42,240
about online privacy. They're talking to
their kids and they're talking to their
410
00:39:42,240 --> 00:39:47,570
neighbors and their friends. So for this
community in particular it would be really
411
00:39:47,570 --> 00:39:53,170
much more effective to go and do an in-
person training. A training in Spanish in
412
00:39:53,170 --> 00:39:59,070
this case. In the language that they're
naturally speaking and have like in-person
413
00:39:59,070 --> 00:40:05,180
resources. That will get you much further
than, you know, compiling lists of ideas or
414
00:40:05,180 --> 00:40:14,660
tools or strategies, that'll probably
never be accessed. And as a vehicle to do
415
00:40:14,660 --> 00:40:18,090
this, we had a really positive
experience working with support
416
00:40:18,090 --> 00:40:23,720
organizations. On the front end, that
allowed us to build trust with the
417
00:40:23,720 --> 00:40:28,050
community, so by working with people who
they already trusted and who already knew
418
00:40:28,050 --> 00:40:33,140
them well I really think we were able to
talk to people much more openly and much...
419
00:40:33,140 --> 00:40:37,200
with much more trust than they would have
otherwise. Whether they would have spoken
420
00:40:37,200 --> 00:40:43,370
to us at all is a question. They also were
a great resource for us as we were
421
00:40:43,370 --> 00:40:50,290
developing interview materials and also
like training materials afterwards when we
422
00:40:50,290 --> 00:41:00,020
went back to communities and conducted
digital trainings. They helped us develop,
423
00:41:00,020 --> 00:41:05,700
you know, culturally sensitive language
and we were able to just ask, you know, is
424
00:41:05,700 --> 00:41:10,370
this location, is this style of
presentation, is this length, is this time,
425
00:41:10,370 --> 00:41:14,290
what should we do? You know, they were a
resource to us to make sure that the
426
00:41:14,290 --> 00:41:16,900
things that we were developing were most
accessible to the people that we were
427
00:41:16,900 --> 00:41:25,260
talking to. And they also themselves, from
what I've seen, have a lot of questions
428
00:41:25,260 --> 00:41:30,180
about the way that they're using
technology. That's a great place to go and
429
00:41:30,180 --> 00:41:35,950
talk to people about, you know,
organizational practices. And you might
430
00:41:35,950 --> 00:41:38,700
find that it's a lot easier to get people
to change their practices if they're in
431
00:41:38,700 --> 00:41:42,580
sort of an organizational setting where
there's peer pressure or maybe some
432
00:41:42,580 --> 00:41:48,950
hierarchy of people who are really
encouraging them to use more secure tools
433
00:41:48,950 --> 00:41:53,670
or to think carefully about data
they're collecting about people that they
434
00:41:53,670 --> 00:41:59,490
contact. So working with these
organizations also might be an opportunity
435
00:41:59,490 --> 00:42:05,440
to do trainings with activists and with
lawyers and with other people who are
436
00:42:05,440 --> 00:42:18,410
working alongside this community. Finally,
which is always a difficult thing to hear
437
00:42:18,410 --> 00:42:24,230
as a trainer, the people we spoke to
probably aren't going to be adopting new
438
00:42:24,230 --> 00:42:32,130
tools. For one, it might not be safe; it's
hard to make that calculus, right? But a
439
00:42:32,130 --> 00:42:38,390
tool that's specifically designed for a
community at risk or in order to do a
440
00:42:38,390 --> 00:42:42,570
particular function that would be of
interest to, for example, the undocumented
441
00:42:42,570 --> 00:42:45,800
community or some other vulnerable
community might increase visibility
442
00:42:45,800 --> 00:42:50,230
depending on the threat model. If they're
found with a particular app or if the app
443
00:42:50,230 --> 00:42:57,670
is, like, exposing the number of users or the
location of users, for example. And it's
444
00:42:57,670 --> 00:43:00,790
not to say that we should stop developing
new tools; we should always think about
445
00:43:00,790 --> 00:43:06,840
ways to make better and safer and more
private resources. But it's worth thinking
446
00:43:06,840 --> 00:43:10,780
especially if you're going to be working
with communities or building resources for
447
00:43:10,780 --> 00:43:15,560
communities, that we should also think
about how to make sure that they're using
448
00:43:15,560 --> 00:43:21,040
the tools they already use more
effectively and more safely. That might
449
00:43:21,040 --> 00:43:24,840
mean sitting down with someone for a while
and going to their privacy settings on
450
00:43:24,840 --> 00:43:30,200
Facebook or, you know, making sure that
their settings on WhatsApp don't
451
00:43:30,200 --> 00:43:38,450
back up data to the cloud or expose phone
numbers to people they don't know. But
452
00:43:38,450 --> 00:43:48,750
there's a lot to do in both of these
directions. And especially if you're going
453
00:43:48,750 --> 00:43:53,960
to be moving into working with these
communities, this is something to keep in
454
00:43:53,960 --> 00:44:04,920
mind, that I thought was especially
poignant. With that, I can take questions.
455
00:44:04,920 --> 00:44:16,890
Applause
Herald Angel (H): So we have four
456
00:44:16,890 --> 00:44:21,930
microphones in this room. I see one is
already occupied with somebody. May I
457
00:44:21,930 --> 00:44:25,290
remind you that a question is typically
one to two sentences and ends with a
458
00:44:25,290 --> 00:44:30,640
question mark. And with that I
will take microphone 4.
459
00:44:30,640 --> 00:44:36,800
Mic4: Hi, thanks! You mentioned that these
communities are reluctant to adopt new
460
00:44:36,800 --> 00:44:41,500
tools. Were there any exceptions to that
or were there any like attributes of new
461
00:44:41,500 --> 00:44:45,830
tools that you think they would be more
likely to adopt?
462
00:44:45,830 --> 00:44:52,560
Allison: Yeah, that's a good question!
I've been thinking about this. I would say
463
00:44:52,560 --> 00:44:57,200
that what I said about reluctance to adopt
new tools is absolutely true when
464
00:44:57,200 --> 00:45:00,660
we're talking about social
media. So it's difficult to, like, move
465
00:45:00,660 --> 00:45:05,330
people to Signal for example from WhatsApp
or Facebook Messenger because the people
466
00:45:05,330 --> 00:45:08,890
they talk to are already on these tools
and it's not just moving one person but
467
00:45:08,890 --> 00:45:16,680
like a community. If we start to think
about tools that might be special-purpose
468
00:45:16,680 --> 00:45:21,180
we didn't talk to anyone who mentioned
this app but I know in the past there have
469
00:45:21,180 --> 00:45:26,410
been discussions about Waze being used,
it's like a crowd-sourced map system, being
470
00:45:26,410 --> 00:45:33,680
used to like track law enforcement. Like I
said we didn't talk to anyone who used
471
00:45:33,680 --> 00:45:40,020
that app but possibly if there's like a
specific utility in it there could be some
472
00:45:40,020 --> 00:45:46,630
critical mass of people who spread the
word in a smaller community. Yeah it's
473
00:45:46,630 --> 00:45:50,440
something to think about. I don't think
it's impossible but I would say it would
474
00:45:50,440 --> 00:45:55,880
be challenging.
H: I assume that the baby doesn't want to
475
00:45:55,880 --> 00:46:00,852
speak on microphone 1 so I'm gonna go to
microphone 3.
476
00:46:00,852 --> 00:46:03,410
Mic3: I have two questions, is that okay?
Allison: Yeah.
477
00:46:03,410 --> 00:46:07,780
Mic3: Thank you. The first one is kind of
a nitty-gritty academic question and that
478
00:46:07,780 --> 00:46:12,100
is: can you tell us anything about your
IRB approval process, what you're doing to
479
00:46:12,100 --> 00:46:16,360
protect subjects' data? Because this is
very sensitive and I'm curious how you've
480
00:46:16,360 --> 00:46:19,360
approached that.
Allison: Yeah absolutely. So we didn't
481
00:46:19,360 --> 00:46:28,310
have IRB approval before we spoke to
anyone. We actually got an exemption for
482
00:46:28,310 --> 00:46:32,940
collecting data about participants. So we
compensated for each interview that we
483
00:46:32,940 --> 00:46:42,170
did; we gave participants $20. We were not
required to collect any proof of payment.
484
00:46:42,170 --> 00:46:48,600
We recorded the interviews and encrypted
them locally. They were translated by
485
00:46:48,600 --> 00:46:55,330
people in our research group and then
transcribed with all identifying location
486
00:46:55,330 --> 00:47:01,950
and name data redacted. And those
were all stored encrypted on our personal
487
00:47:01,950 --> 00:47:07,660
drives and then on a university drive. All
the data has been deleted now, all of the
488
00:47:07,660 --> 00:47:12,538
original data as well.
Mic3: Awesome! Thanks. The other one is a
489
00:47:12,538 --> 00:47:17,050
big picture scatterbrain question: which
is about how this is a technological
490
00:47:17,050 --> 00:47:23,830
solution to a political problem. Do you
feel that directing or helping immigrants
491
00:47:23,830 --> 00:47:29,400
understand how to protect themselves
technologically, is the answer or
492
00:47:29,400 --> 00:47:34,090
necessarily part of the answer or do you
feel like maybe eventually our community
493
00:47:34,090 --> 00:47:37,520
needs to be helping people exit places
like the U.S. that are increasingly
494
00:47:37,520 --> 00:47:43,384
hostile to immigrants?
Allison: That's a good question. I don't
495
00:47:43,384 --> 00:47:49,630
think that helping people be more safe
online is really a solution. I mean the
496
00:47:49,630 --> 00:47:55,061
solution's gonna be in policy and in law. I
think this has utility really in the
497
00:47:55,061 --> 00:47:58,660
short term, like making sure people feel
safe and have more control over
498
00:47:58,660 --> 00:48:03,440
disclosure to the extent that they can.
But I don't think that's going to,... I
499
00:48:03,440 --> 00:48:09,070
don't think that's a winning, you know,
single pronged battle. As for leaving the
500
00:48:09,070 --> 00:48:14,080
United States that's kind of a funny
question considering how much people have
501
00:48:14,080 --> 00:48:17,850
sacrificed to come to the U.S. and
especially having integrated into
502
00:48:17,850 --> 00:48:23,000
communities already. A lot of the people I
spoke about today were long-term residents
503
00:48:23,000 --> 00:48:26,190
I mean everyone was a long-term resident.
So they've sort of built their lives in
504
00:48:26,190 --> 00:48:29,600
the U.S. But there has been a significant
decrease in the number of people
505
00:48:29,600 --> 00:48:35,260
immigrating to the U.S. without
authorization, and that's thanks to Obama-era
506
00:48:35,260 --> 00:48:41,110
policies of, like, you know, returning people
immediately at the border. So whether
507
00:48:41,110 --> 00:48:44,560
people are now moving to other countries
is a good question and whether we should
508
00:48:44,560 --> 00:48:48,480
encourage that is... I don't know,
interesting.
509
00:48:48,480 --> 00:48:54,543
Mic3: Thank you.
H: Microphone 2.
510
00:48:54,543 --> 00:49:02,100
Mic2: Hi, so I have a question: Are there
any initiatives to help the people in a
511
00:49:02,100 --> 00:49:11,910
way that, so... The fact that they don't...
they feel that they are at less risk online
512
00:49:11,910 --> 00:49:16,650
and they don't perceive the risk as much
and do you feel that helping them
513
00:49:16,650 --> 00:49:21,270
understand those risks and maybe trying
to be more secure online will actually
514
00:49:21,270 --> 00:49:27,160
help them, or is that resignation
towards the government accurate?
515
00:49:27,160 --> 00:49:41,970
Allison: If you're thinking about specific
people, I think... Maybe an individual's
516
00:49:41,970 --> 00:49:47,470
information is going to be accessible in
the long run if immigration enforcement
517
00:49:47,470 --> 00:49:51,940
really chooses to, so maybe that sense of
resignation is to some extent accurate, but
518
00:49:51,940 --> 00:49:58,220
lots of people aren't necessarily on the
radar. And I think what's most beneficial
519
00:49:58,220 --> 00:50:03,590
about helping people understand how to use
technology more effectively is that
520
00:50:03,590 --> 00:50:08,960
it's really just increasing confidence.
It's this uncertainty and, like, choosing to
521
00:50:08,960 --> 00:50:11,560
abstain from participating in
conversations because they just don't
522
00:50:11,560 --> 00:50:15,211
trust that they can be secure, like,
private enough. You know, or that their
523
00:50:15,211 --> 00:50:18,860
personal information, their home addresses,
that they're still at risk of this
524
00:50:18,860 --> 00:50:23,660
harassment, like that's... That lack of
confidence in privacy is really what I
525
00:50:23,660 --> 00:50:40,210
think can be helped, and... Sorry, I had
another point. Yeah, but it's worthwhile,
526
00:50:40,210 --> 00:50:44,230
you know, thinking about how you can
contribute to helping. I mean, even
527
00:50:44,230 --> 00:50:50,980
outside of, like, privacy work, a lot of
people really are just eager to learn more
528
00:50:50,980 --> 00:50:58,880
about how to use technology to help
their lives. Right, so the other thing I
529
00:50:58,880 --> 00:51:04,150
was going to say was, we also put
significant thought into,
530
00:51:04,150 --> 00:51:06,880
you know, how to have these conversations
with people and like how to ask questions
531
00:51:06,880 --> 00:51:12,960
about, you know, the risks online without
really freaking them out. Because we
532
00:51:12,960 --> 00:51:15,930
didn't really have solutions. It's not
like at the end of an interview we could
533
00:51:15,930 --> 00:51:20,560
say, like, well, we have a solution for you,
just install this app and you'll be safe.
534
00:51:20,560 --> 00:51:25,390
So, it's sort of this balance between
making sure that people still, you know,
535
00:51:25,390 --> 00:51:30,600
use tools in the way that's so helpful for
their lives. Right like we don't want them
536
00:51:30,600 --> 00:51:33,920
to stop using Facebook if it means that
they stop talking to their parents back in
537
00:51:33,920 --> 00:51:37,640
Mexico. We don't want them to stop using
email if it means that they can't talk to
538
00:51:37,640 --> 00:51:42,730
their kids' teachers anymore. So it's this
balance between like being aware of the
539
00:51:42,730 --> 00:51:45,900
risk and being confident that you're doing
as much as you can while not choosing to
540
00:51:45,900 --> 00:51:50,060
abstain.
H: So I'm hiding here in the corner
541
00:51:50,060 --> 00:51:53,630
because I'm trying to see whether
somebody's at number four. There's
542
00:51:53,630 --> 00:51:58,890
somebody there, yes. So microphone 4, please.
Mic4: Thanks. Hi, so I was wondering since
543
00:51:58,890 --> 00:52:04,810
Facebook is the most popular tool that
they use and they probably won't change
544
00:52:04,810 --> 00:52:10,760
it, did you find anything that the people
at Facebook could do to help undocumented
545
00:52:10,760 --> 00:52:15,020
immigrants more?
Allison: Yeah, I think the things that
546
00:52:15,020 --> 00:52:18,570
Facebook can think about are really
generalizable to a lot of vulnerable
547
00:52:18,570 --> 00:52:25,131
communities. There were a few
things in particular that some people are
548
00:52:25,131 --> 00:52:31,180
really uncomfortable with. For example, on
WhatsApp, if you're added to, like, a group
549
00:52:31,180 --> 00:52:35,620
of people, your phone number is exposed to
everyone else in the group without your
550
00:52:35,620 --> 00:52:39,730
consent, and that might be the case with,
like, group SMS and things. But, like, the
551
00:52:39,730 --> 00:52:44,250
fact that WhatsApp even uses a phone number
is kind of something that we should
552
00:52:44,250 --> 00:52:50,650
migrate out of, right. Facebook collecting
phone numbers and collecting, you know,
553
00:52:50,650 --> 00:52:59,330
location data regardless of how easy it is
to opt in and out. And so, this is
554
00:52:59,330 --> 00:53:05,680
primarily an academic work that's going to
appear at CHI, a human-computer
555
00:53:05,680 --> 00:53:11,140
interaction conference, and we talk a lot
in the paper about what these bigger
556
00:53:11,140 --> 00:53:18,990
services can do. And really like we as a
community can advocate for Facebook
557
00:53:18,990 --> 00:53:23,240
resisting cooperating with law enforcement,
right. I mean, it shouldn't really matter
558
00:53:23,240 --> 00:53:28,380
to Facebook where you live or how you
got there. They're a social media platform;
559
00:53:28,380 --> 00:53:33,750
they shouldn't be, you know, helping
immigration move people around physical
560
00:53:33,750 --> 00:53:42,060
borders. They should be totally, you know,
border-agnostic. So advocating for that
561
00:53:42,060 --> 00:53:49,790
kind of attitude shift would be helpful.
H: Microphone 2.
562
00:53:49,790 --> 00:53:54,290
Mic2: So thank you for the very
interesting talk. And I have a question
563
00:53:54,290 --> 00:54:00,410
that sort of picks up on the previous one.
And because, as you talked about,
564
00:54:00,410 --> 00:54:06,230
Facebook has become such an important sort
of a political actor in this arena. I'm
565
00:54:06,230 --> 00:54:10,440
wondering if you've been following up on
that as a survey research problem like
566
00:54:10,440 --> 00:54:15,220
what's, what is there, what is it that
they are doing and is this something
567
00:54:15,220 --> 00:54:23,190
that's happening unwittingly or is there
something about the general strategy of
568
00:54:23,190 --> 00:54:29,380
Facebook that sort of helps create this kind
of trust. And I'm also wondering,
569
00:54:29,380 --> 00:54:35,870
taking that question further, sorry, it's
more than a sentence, is
570
00:54:35,870 --> 00:54:40,220
if you've been thinking about whether you
see anything sort of suddenly eroding that
571
00:54:40,220 --> 00:54:45,030
trust in the future, and I'm specifically
thinking about this now, this question
572
00:54:45,030 --> 00:54:52,880
about how it was possible for all this
Russian money to go into Facebook
573
00:54:52,880 --> 00:54:59,910
advertisements, and that's
kind of pointing in the direction of pressure
574
00:54:59,910 --> 00:55:08,350
for Facebook to be less, sort of, general in
their trust and picking up on certain, on
575
00:55:08,350 --> 00:55:15,650
specific political issues, which could also
be immigration, and disclosing some
576
00:55:15,650 --> 00:55:21,340
information that they already have?
Allison: Your question about whether there could
577
00:55:21,340 --> 00:55:26,640
be a shift in trust in the future if
something could trigger that. The example
578
00:55:26,640 --> 00:55:31,030
in Detroit, right, where law enforcement was
able to get a phone number from Facebook
579
00:55:31,030 --> 00:55:36,530
with a warrant and then track the person
with this phone number. If there are more
580
00:55:36,530 --> 00:55:41,780
and more cases of social media data being
used in immigration cases, and there's
581
00:55:41,780 --> 00:55:48,370
evidence to think that that might happen,
it's possible that narrative might
582
00:55:48,370 --> 00:55:52,430
overtake this sense that people have right
now that Facebook's looking out for them
583
00:55:52,430 --> 00:56:00,750
by keeping their account, you know,
there, and letting them control it. In
584
00:56:00,750 --> 00:56:08,070
terms of Facebook picking up immigration
as sort of an activist or a political
585
00:56:08,070 --> 00:56:15,200
topic that they're interested in, I would
not hold my breath on that one, but we'll
586
00:56:15,200 --> 00:56:19,350
see. Yeah.
H: So we have time for exactly one more
587
00:56:19,350 --> 00:56:26,160
question, and that is on microphone 1.
Mic1: Hi, did you collect any information
588
00:56:26,160 --> 00:56:32,260
or study anything about how these people
were using financial services and
589
00:56:32,260 --> 00:56:36,720
things like online payments? Did they have
bank accounts, were they concerned about
590
00:56:36,720 --> 00:56:44,920
their financial privacy?
Allison: Yeah. Actually, about the concerns
591
00:56:44,920 --> 00:56:49,050
they had with privacy in terms of the
way that they were using, like, online
592
00:56:49,050 --> 00:56:53,540
banking: people were, I mean, using
credit cards and online banking and paying
593
00:56:53,540 --> 00:56:59,050
rent, you know, or utilities online. They
didn't talk about privacy much in that
594
00:56:59,050 --> 00:57:03,110
context except that they have this concern
about their financial information being
595
00:57:03,110 --> 00:57:07,700
stolen by hackers. Right, like, the concern
is about other people rather than the
596
00:57:07,700 --> 00:57:15,930
entities that are providing these
services. And I think a lot of the concern
597
00:57:15,930 --> 00:57:19,370
there is coming from the fact that they
have a lot to lose and very few legal
598
00:57:19,370 --> 00:57:27,350
protections should something bad happen
to them. But, yeah, so just generally, like,
599
00:57:27,350 --> 00:57:32,690
people were using online banking and had
bank accounts and were using these online
600
00:57:32,690 --> 00:57:36,280
financial services. Some people were
opting out, but it wasn't due to privacy
601
00:57:36,280 --> 00:57:39,710
concerns, it was because they were worried
about using their credit card on the
602
00:57:39,710 --> 00:57:45,240
Internet.
H: So with that, I'd like you to help me
603
00:57:45,240 --> 00:57:47,844
thank our speaker Allison for this
wonderful talk.
604
00:57:47,844 --> 00:57:53,545
Applause
605
00:57:53,545 --> 00:58:03,257
34C3 postroll music
606
00:58:03,257 --> 00:58:15,000
subtitles created by c3subtitles.de
in the year 2020. Join, and help us!