-
It will also be rebroadcast
at a later date
-
on Comcast 8, RCN 82, and Fios 196.
-
For public testimony, written comments
-
may be sent to the committee email
-
at ccc.go@boston.gov
-
and it will be made part
of the official record.
-
This is an ordinance
-
that would ban the use
of facial surveillance
-
by the city of Boston or any
official in the city of Boston.
-
The proposal would also prohibit
-
entering into agreements, contracts
-
to obtain face
surveillance with third parties.
-
I'm gonna turn it now
over to the lead sponsors,
-
councilor Wu and councilor Arroyo.
-
And then I will turn it to my colleagues
-
in order of arrival, for opening remarks.
-
And just to acknowledge my colleagues
-
and I believe the order that I had so far,
-
again it will be councilor Arroyo,
-
councilor Wu, councilor
Breadon, councilor Bok,
-
councilor Mejia, councilor Campbell,
-
and then I had councilor
O'Malley, councilor Essaibi George
-
and then I'm sorry, I started talking.
-
So other councilors have joined us.
-
MICHAEL: Good morning madam chair.
-
Councilor Flaherty was before me,
-
councilor Edwards, thanks.
-
I apologize councilor Flaherty,
-
then councilor Campbell,
then councilor O'Malley
-
and councilor Essaibi George.
-
I will now turn it over
to the lead sponsors.
-
Does that start with me?
-
CHAIR: Sure.
-
Thank you, madam chair.
-
I think we have excellent panelists
-
to kind of go into the intricacies
-
of the facial recognition ban.
-
But to just kind of summarize it,
-
this facial recognition ban
creates a community process
-
that makes our systems more inclusive
-
of community consent while
adding democratic oversight,
-
where there currently is none.
-
We will be joining our neighbors
-
in Cambridge and Somerville.
-
And specifically when it comes
to facial recognition tech,
-
it doesn't work, it's not good.
-
It's been proven through the data
-
to be less accurate for
people with darker skin.
-
A study by one of our
panelists, Joy Buolamwini,
-
a researcher at MIT,
-
found that black women
-
were 35% more likely than white men
-
to be misclassified by face
surveillance technology.
-
The reality is that face surveillance
-
isn't very effective in its current form.
-
According to records obtained by the ACLU,
-
one manufacturer promoting
this technology in Massachusetts
-
admitted that it might only
work about 30% of the time.
-
Currently, my understanding
-
is that the Boston Police
Department does not use this.
-
This isn't a ban on surveillance.
-
This is simply a ban on facial
recognition surveillance,
-
which I think the data
has shown doesn't work.
-
BPD doesn't use it because,
-
and I will allow commissioner
Gross to speak on it.
-
But my understanding from what I've heard
-
and past comments from commissioner Gross
-
is that they don't wanna use technology
-
that doesn't work well.
-
And so this ban isn't necessarily
even a permanent ban,
-
it just creates a process
where we are banning something
-
that doesn't have a process,
-
that doesn't have community trust
-
and doesn't have community consent
-
and which would require
some democratic oversight
-
if they ever do want to implement it,
-
if the technology in the
future ever does improve.
-
And so with that, I cede
the rest of my time.
-
Thank you, madam chair.
-
And thank you also to my
co-sponsor councilor Wu.
-
Thank you very much madam chair.
-
Thank you also to councilor Arroyo
-
and all of the coalition partners
-
that have been working
on this for many months
-
and providing all sorts of
incredibly important feedback.
-
We scheduled this hearing
many weeks ago at this point,
-
I think over a month ago.
-
And it just so happened that
the timing of it now has lined
-
up with a moment of great national trauma.
-
And in this moment,
-
the responsibility is on
each one of us to step up,
-
to truly address systemic
racism and systemic oppression,
-
but the responsibility is
-
especially on elected
officials, policy makers,
-
those of us who have the power
-
and the ability to make change now.
-
So I am proud that
Boston has the potential
-
to join our sister cities
across Massachusetts
-
in taking this immediate
step to ban a technology
-
that has been proven to
be racially discriminatory
-
and that threatens basic rights.
-
We are thankful that Boston police
-
agree with that assessment
-
and do not use facial recognition,
-
facial surveillance today in Boston,
-
and are looking to
codify that understanding
-
so that with the various
mechanisms for technology
-
and upgrades and changing circumstances
-
of different administrations
and personalities,
-
that we just have it on the books
-
that protections come first,
-
because we truly believe
that public health
-
and public safety should
be grounded in trust.
-
So the chair gave a wonderful description
-
of this ordinance.
-
I am looking forward to the discussion,
-
but most of all I'm grateful to colleagues
-
and everyone who has contributed
-
to this conversation,
-
as well as the larger conversation about
-
community centered
oversight of surveillance,
-
which will be in a
separate but related docket
-
that will be moving
forward on the council.
-
Thank you very much.
-
Councilor Breadon, opening remarks.
-
Thank you.
-
Thank you, madam chair,
-
thank you to the makers of this ordinance.
-
I really look forward to the
conversation this afternoon.
-
I'm very proud to participate
in this discussion.
-
I think it's the proper
and right thing to do
-
to really consider deeply
all the implications
-
of new technology that hasn't
been proven to be effective
-
before introducing it in
any city in Massachusetts.
-
So I really look forward to learning more
-
and hearing from the panelists
this afternoon, thank you.
-
CHAIR: Councilor Bok.
-
Ah, yes.
-
Thank you, madam chair.
-
And I hope everyone will forgive me
-
for being outside for a minute.
-
I wanted to just say,
-
I'm excited about this conversation today.
-
I think it's a piece of action
we should absolutely take.
-
And I just wanna say that,
-
I think it's really important
that this technology
-
has major shortcomings
-
and that also it has a
racial bias built into it.
-
Those are really good
reasons not to use it,
-
but honestly, even if those did not hold,
-
I think that in this country,
-
we run the risk sometimes of
just starting to do things
-
that new technology can do,
-
without asking ourselves like
as a democratic populace,
-
as a community, is this
something that we should do?
-
And I think in the process,
-
we as Americans have given up
-
a lot of our rights and privacy,
-
frankly as often as not
to private corporations
-
as to government.
-
But I think that it's really important
-
to just remember that there's
a true public interest
-
in a society that is built more on trust
-
than on surveillance.
-
And so I think it's really important
-
for us to draw this line in the sand
-
so that if the technology improves
-
and some of those arguments
-
that are based on its functionality
-
start to slip away, that
we're still gonna have
-
a real conversation together
-
about how we wanna live
-
and what kind of a society we want to be.
-
So to me, putting a
moratorium ban like this
-
into place, makes a ton of sense.
-
And I'm excited for the conversation
-
and grateful to all the advocates.
-
Thank you so much madam chair
-
Thank you.
-
Councilor Mejia.
-
Yes, thank you.
-
Sorry about my audio.
-
Thank you, chairwoman Edwards,
-
and thank you to the makers, councilor Wu,
-
and I applaud you for your
leadership on this issue.
-
For me this issue is
personal and professional
-
as a Dominican woman who
claims her black roots.
-
Facial recognition technology
-
misidentifies people like me by 35%.
-
We're in a time where our technology
-
is outpacing our morals.
-
We've got a 2020 technology
with a 1620 state of mind,
-
but that's my thinking.
-
As a city, we need to
practice extreme caution
-
over facial recognition technology.
-
And we should be concerned
-
not only when this technology goes wrong,
-
but also when it goes right.
-
We have a lot of work to do
-
when it comes to building
trust in our government.
-
Working against this facial
recognition technology
-
is a good step and I'm happy to hear
-
there's pushback against
this kind of surveillance
-
on all sides of the issue.
-
Like the makers of the
ordinance mentioned,
-
now, right now, there's not a desire
-
to put facial recognition
technology in place.
-
And this ordinance is an
opportunity to put that into law.
-
I look forward to the discussion
-
and I hope to continue
to push on this issue
-
alongside councilors Wu and Arroyo,
-
thank you so much.
-
Thank you very much.
-
Councilor Flaherty.
-
MICHAEL: Thank you madam chair
-
and the sponsors for this hearing.
-
As I've stated in the past
the technology updates
-
and advances have provided
us with many advantages,
-
but at the same time,
-
they've also proven to be deeply flawed,
-
unreliable and disproportionately impacting
-
communities of color.
-
This technology,
-
just for everyone's full
disclosure if you're listening,
-
this technology is currently
not being used by the BPD
-
for that reason, and in fact the
commissioner is here
-
and good afternoon to the commissioner.
-
I appreciate the work
that you've been doing
-
through all of this.
-
Whether it's just the
day-to-day public safety,
-
rigors of your job,
-
or it's been through
the COVID-19 response,
-
or it's been, most recently and tragically
-
in response to George
Floyd's horrific death
-
and everything that has come from that.
-
I wanna make sure
-
that I'm gonna continue
to remain supportive
-
of setting transparent limits
in creating a public process
-
by which we assess the
use of these technologies.
-
I know that we've been able
to solve a lot of crimes
-
through video surveillance
-
and also through video, quite frankly,
-
and also they've led
to justice for families
-
and as someone that has lost
a cousin due to murder,
-
for us, it was surveillance
-
and it was DNA that led to
my family getting justice.
-
So looking forward to
learning more about this
-
and the commissioner's perspective on it,
-
but also recognizing that collectively
-
we need to make sure that
we're putting forth decisions
-
that make the most sense for Boston.
-
I'll get real parochial here.
-
Quite frankly, I'm not really
concerned about Cambridge
-
and Somerville, I wanna make sure
-
that whatever we're doing
in Boston is impacting
-
all of our neighborhoods.
-
The neighborhoods that I represent
-
as a citywide city councilor.
-
And making sure that we
have community input,
-
most importantly.
-
We work for the people
of Boston, as a council
-
and obviously, as do the
commissioner and police officers,
-
we work for the residents and
the taxpayers of our city.
-
So they need a seat at the table
-
and we want to hear from them.
-
If the technology advances
-
and it can help us as a public
safety tool, great.
-
But as it stands,
-
it's still unreliable and
there are flaws to it.
-
So as a result that's
where I'm gonna be on this.
-
Thank you madam chair.
-
Thank you very much.
-
Councilor Campbell.
-
Thank you, madam chair and thank you,
-
I know you're gonna run a tight ship
-
given all that's happening today,
-
including the funeral, of
course for George Floyd.
-
So thank you very much.
-
Thank you also to, of course,
-
to the commissioner for being here,
-
given all that's going on.
-
I know you've been on
the record in the past
-
with respect to this issue
-
and that not only does the
department not have it,
-
but you understand the flaws of it.
-
So thank you for being here.
-
Thank you to my council colleagues,
-
but also most importantly the makers
-
for putting this forward.
-
I have been supportive of
this since the very beginning,
-
when I hosted,
-
along with councilor Wu
and councilor McCarthy,
-
a hearing on this very issue
-
along with some other
surveillance technologies
-
and things that were possibly
coming down the pipe.
-
So this was an opportunity for us
-
to have this conversation on
what we could do proactively
-
to make sure that surveillance tools
-
were not put into place without
a robust community process.
-
And of course, council input.
-
So back then I supported
it and I support it now.
-
Again thank you commissioner,
thank you to your team,
-
thank you to the makers
and thank you madam chair.
-
And Aiden also says, thank you.
-
Thank you Aiden and thank
you councilor Campbell.
-
councilor O'Malley.
-
Thank you madam chair.
-
Just very briefly wanna thank the makers,
-
councilor Wu and Arroyo
for their leadership here.
-
Obviously I support the
facial recognition ban.
-
Also grateful, wanna echo my
thanks to the commissioner
-
for his comments and his opposition,
-
the fact that this isn't used.
-
So it's important that we
codify that in such a way,
-
which again is the purpose of this hearing
-
and this ordinance and
stand ready, willing
-
and able to get to work.
-
Thank you and look forward
-
to hearing from a number of residents,
-
our panelists, as well as a
public testimony after, thanks.
-
Thank you very much,
councilor Essaibi George.
-
Thank you madam chair,
-
and thank you to the commissioner
-
for being with us this afternoon.
-
I look forward to hearing from him
-
and from the panelists discussing
this ordinance further.
-
I think that it is necessary
-
to ban this technology at this point.
-
I think that our colleagues
have brought to the forefront,
-
over the last few years
as have the advocates,
-
the concerns regarding
this kind of technology.
-
I appreciate the real concerns
around accuracy and biases
-
against people of color,
women, children, the elderly,
-
and certainly reflective
of our city's residents.
-
So we need to make sure
that we're proactive
-
in protecting our residents
-
and having a deeper understanding
-
and appreciation of this technology
-
and understanding what
this ordinance would mean
-
around that ban and
following through with it.
-
Thank you, madam chair.
-
Thank you very much.
-
I understand councilor
Flynn is trying to get in
-
and there might be a limit.
-
I'm just saying that for our tech folks.
-
In the meantime, I'm
going to read a letter
-
from city council president Kim Janey,
-
"Dear chairwoman Edwards, due
to technical difficulties,
-
I'm unable to log in for
today's hearing on docket 0683,
-
regarding an ordinance
-
banning facial recognition
technology in Boston.
-
I will continue to work on
the technical challenges
-
on my end and will join
the hearing at a later time
-
if I am able. If I'm unable to join,
-
I'll review the recording at a later time.
-
I want to thank the makers
councilors Wu and Arroyo
-
for their advocacy and
continued leadership
-
on this critical issue.
-
I especially want to
thank all the advocates
-
who are participating
-
for their tireless advocacy on this issue.
-
I look forward to working with advocates
-
to incorporate their ideas
-
on how we can push this
issue moving forward.
-
It is clear we need policies in place
-
to protect communities that
are vulnerable to technologies
-
that are biased against them.
-
I look forward to working
with my colleagues
-
on this critical issue,
-
and we'll continue to fight for an agenda
-
that promotes and protects black lives.
-
Please read this letter
into the record, thank you.
-
Sincerely, Kim Janey."
-
I'm not sure if we've been
joined by councilor Flynn.
-
I do know he was trying to get in.
-
In the meantime I did want to just do
-
some quick housekeeping as part
of my introductory remarks.
-
Just to let people know again,
-
today's hearing is specifically
-
about this proposed ordinance
-
that would ban facial
recognition technology.
-
We had a robust conversation
about defunding the police,
-
discussing all aspects of the budget,
-
all of those different things just happened
-
and it was chaired by councilor Bok.
-
And I wanna make sure that
our conversations are related,
-
that we do not take this precious
moment on this legislation
-
and have it covered up by
the other conversation.
-
I just want you to know that
-
and again, I will move the conversation
-
towards this ordinance.
-
There should be a copy of it.
-
People should have the ability
-
to discuss the
language in this ordinance.
-
Questions I have,
-
I do want to talk about,
whether time is of the essence.
-
There seems to have been some issue,
-
I'm concerned about whether
we were entering a contract
-
that allowed for this.
-
So when the commissioner speaks,
-
I'd like for him to speak on that.
-
I'm not sure if there's a
contract pending what's going on.
-
I also wanna make sure
that folks understand
-
due to the fact that I think
-
we have possibly a hundred
people signed up to speak.
-
We have a lot of folks
-
who are gonna be speaking on the panel.
-
I'm going to limit my
colleagues to three minutes
-
in their first round.
-
And I'm gonna get the
public to no more than two.
-
And I'm gonna try my best
to get through that list.
-
So I see we've been
joined by councilor Flynn,
-
councilor Flynn, do you
have any opening remarks?
-
Thank you.
-
Thank you, councilor Edwards,
-
and to the sponsors and looking
forward to this hearing,
-
learning, learning more about the proposal
-
and also interested in hearing
-
from the commissioner of
police, commissioner Gross
-
and with the public as well.
-
Again, this is an issue
that we all can learn from.
-
We can listen to each
other, we can be respectful
-
and hear each other's point of view.
-
And that's what this city is all about,
-
is coming together and
treating each other fairly.
-
And with respect, especially
during difficult days
-
and on difficult issues.
-
I also know that our police commissioner
-
is someone that also listens to people
-
and he's in the neighborhoods and he talks
-
and listens to a lot of the
concerns of the residents.
-
That's what I like most about
the police commissioner,
-
is his ability to listen
and treat people fairly.
-
Again, I'm here to listen
and learn about the proposal.
-
Thank you councilor Edwards
for your strong leadership.
-
Thank you councilor Flynn.
-
I just wanna let people know,
-
we actually have a hundred
percent limit on the Zoom.
-
We have 118 people trying to get in.
-
So once you've testified,
-
could you at least sign out,
-
that allows for other people to get in.
-
Commissioner Gross you've been waiting.
-
So I'm gonna turn the
microphone over to you.
-
I'm gonna ask that you get
right to the ordinance
-
and get to your thoughts on that.
-
I'm happy to discuss any procedure.
-
Then we're gonna try
and get to the panelists
-
and move back to the city
council while you're still here.
-
Okay.
Yes, thank you.
-
Good afternoon everyone.
-
And God bless Mr. George
Floyd, may he rest in peace.
-
And for the record, I'm proud of Boston,
-
as they paid great homage
in peaceful protest
-
and anything we can do to
move our department forward,
-
working with our community.
-
You have it on record right here,
-
and now I'm willing to
work with you councilors,
-
the mayor, in our great city.
-
So, thank you.
-
So I will start by thanking the chair
-
and the committee members and the makers,
-
councilor Wu and Arroyo.
-
I really thank you for
inviting me to participate
-
in today's hearing regarding the ordinance
-
banning facial recognition
technology in Boston.
-
As you can imagine the
Boston Police Department
-
has been extremely busy
meeting the current demands
-
of public safety from
the COVID-19 pandemic
-
and the public rallies.
-
However, the department and I understand
-
the importance of the given topic.
-
I would like to request
-
that the hearing remain on the given topic
-
and that we work together
to find other times
-
to discuss the important issues
regarding police relations,
-
national issues on police reform
-
and the ways that we can improve
-
our relationships here in Boston.
-
As has been practiced in the past,
-
we welcome a future opportunity
-
to discuss the ordinance language
-
and city council concerns
in a working session
-
regarding technology policy
-
and potential privacy concerns.
-
My testimony today is meant
to serve as a background
-
to BPD's current practices
-
and identify potential
technological needs.
-
The department for the record,
-
does not currently have the technology
-
for facial recognition.
-
As technology advances however,
-
many vendors have and will
continue to incorporate
-
automated recognition abilities.
-
We have prohibited these features
-
as we have moved along in our advancements
-
with the intention to have rich
dialogue with the community
-
prior to any acquisition
of such a technology.
-
Video has proven to be one
of the most effective tools
-
for collecting evidence
of criminal offenses,
-
for solving crimes and locating missing
-
and exploited individuals.
-
Any prohibitions on
these investigative tools
-
without a full understanding
of potential uses
-
under strict protocols,
-
could be harmful and impede our ability
-
to protect the public.
-
The proposed ordinance defines
facial surveillance to mean,
-
an automated
or semi-automated process
-
that assists in identifying
or verifying an individual
-
or in capturing information
about an individual
-
based on the physical characteristics
-
of an individual's face.
-
And to be clear,
-
the department has no desire
-
to employ a facial surveillance system
-
to generally surveil
the citizens of Boston.
-
The department however notes a distinction
-
between facial surveillance systems
-
and facial recognition technology.
-
The department believes
-
that the term facial
recognition technology
-
better describes the investigative nature
-
of the technology we believe
would be useful to the city,
-
with the right safeguards
and community input.
-
The department would,
-
as technology advances
and becomes more reliable,
-
like the opportunity to
discuss the utilization
-
of facial recognition technology
-
to respond to specific crimes
and emergency situations.
-
And I would like to
revisit everyone's thoughts
-
and remembrances of the Boston Marathon
-
and the most recent kidnappings.
-
Facial recognition technology
to the best of our knowledge,
-
will greatly reduce the hours necessary
-
to review video evidence.
-
This would also allow
investigators to move more quickly,
-
identify victims, missing persons
-
or suspects of crime through
citywide camera recordings.
-
This technology, if acquired
-
would also allow for the compilation
-
and condensing of video to
be done more efficiently.
-
Facial recognition technology
-
used with well-established guidelines
-
and under strict review,
-
can also provide a safer environment
-
for all in the city of Boston.
-
Creating a system in
which evidence is reviewed
-
in a timely manner and efficiently,
-
allowing police to intervene
-
and reduce levels of
crime, would be beneficial.
-
The department rejects any
notion in the ordinance
-
that we would ever use
facial recognition technology
-
in our response to the COVID-19 pandemic.
-
And just for the record,
-
I'd like to repeat that,
-
the department rejects any
notion in the ordinance
-
that we are or will ever use
facial recognition technology
-
in our response to the COVID-19 pandemic.
-
Finally,
-
it is important to note,
-
the department shares the
concerns of our community,
-
our community's concerns around privacy
-
and intrusive surveillance,
-
As is the current practice
with existing technology,
-
our intentions are to implement
-
any potential future advances
-
in facial recognition technology,
-
through well-defined
protocols and procedures.
-
Some areas we consider, excuse me,
-
some areas we consider this
technology may be beneficial,
-
are in identifying the route
-
and locations of missing persons,
-
including those suffering from
Alzheimer's and dementia.
-
As well, missing children,
kidnapped individuals,
-
human trafficking victims,
-
and the suspects of
assaults and shootings.
-
Also the victims of
homicide and domestic abuse.
-
Important to note,
-
we would like to work with
the council and our partners
-
to clearly articulate the language
-
that could best describe these uses.
-
Section B2 of the proposed ordinance
-
indicates there is some intention
-
to account for the use of this technology
-
to advance investigations
-
but the department believes
some additional language
-
referencing permitted uses is necessary.
-
We as well, would
appreciate a working session
-
within the next 60 days,
-
to draft language that we
can collectively present
-
to constituents and
interested groups
-
for feedback and input.
-
We also plan to use the public testimony
-
being provided in today's hearing
-
to assist in our decision-making.
-
I'd like to thank you for this opportunity
-
to provide this testimony
-
and your continued interest and time
-
in creating public forums,
-
around police technology and advancements.
-
That's my official
statement that I read in.
-
And I'm telling you right now
as an African-American male,
-
the technology that is in place today
-
does not meet the standards of
the Boston Police Department,
-
nor does it meet my standards.
-
And until technology advances
-
to a point where it is more reliable,
-
again, we will need your
input, your guidance,
-
and we'll work together
to pick that technology
-
which is more conducive to our privacy
-
and our rights as citizens
of Boston, thank you.
-
Thank you very much commissioner Gross.
-
Do you, just before I go on,
-
we're gonna go now to Joy Buolamwini
-
and I am so sorry for that Joy.
-
I apologize profusely for the
butchering of your last name.
-
My'Kel McMillen, Karina
Hem, Kade Crawford,
-
Erik Berg, and Joshua Barocas.
-
But I wanted to ask the
commissioner just very quickly,
-
you said you have suggested language
-
or that you would want to work
on suggested language for B2.
-
No, I would want to work on it.
-
So keep in mind anything
we do going forward,
-
as I promised for the last four years,
-
we want your input and your guidance.
-
Councilor Mejia, you hit it on point.
-
This technology is not fair to everyone,
-
especially African-Americans, Latinos,
-
it's not there yet.
-
So moving forward,
-
we have to make sure
everybody's comfortable
-
with this type of technology.
-
Thank you.
-
So now I'm gonna turn it
over to some of the folks
-
who have been called on to speak.
-
We're gonna try and get through
-
as many of them as possible.
-
Commissioner, these are the
folks who helped to push
-
and move this ordinance.
-
We really hope you can
stay as long as you can
-
to hear them, also.
-
I'm looking forward to
working with the folks
-
in the community.
-
Wonderful, thank you.
-
So Joy.
-
Do I?
-
Hello?
-
Oh, wonderful, Joy.
-
The floor is yours.
-
If you could take about
three minutes, no more.
-
Sounds good.
-
So thank you so much madam chair Edwards,
-
members of the committee
on government operations
-
and members of the Boston City Council
-
for the opportunity to testify today,
-
I am the founder of the
Algorithmic Justice League
-
and an algorithmic bias researcher.
-
I've conducted MIT studies
-
showing some of the
largest recorded gender
-
and racial biases in AI
systems sold by companies,
-
including IBM, Microsoft and Amazon.
-
As you've heard the deployment
of facial recognition
-
and related technologies
-
has major civil rights implications.
-
These tools also have
technical limitations
-
that further amplify
harms for black people,
-
indigenous people, other
communities of color,
-
women, the elderly, those
with dementia and Parkinson's,
-
youth trans and gender
non-conforming individuals.
-
In one test I ran,
-
Amazon's AI even failed on
the face of Oprah Winfrey,
-
labeling her male.
-
Personally, I've had to resort
to wearing a white mask,
-
to have my dark skin detected
by some of this technology,
-
but given mass surveillance applications,
-
not having my face
detected can be a benefit.
-
We do not need to look to China
-
to see this technology
being used for surveillance
-
of protesters with little
to no accountability,
-
and too often in violation of
our civil and human rights,
-
including first amendment
rights of freedom of expression,
-
association, and assembly.
-
When the tech works,
-
we can't forget about
the cost of surveillance,
-
but in other contexts it can fail
-
and the failures can be harmful.
-
Misidentifications can
lead to false arrests
-
and accusations.
-
In April 2019, a Brown University senior
-
was misidentified as a terrorist suspect
-
in the Sri Lanka Easter bombings.
-
The police eventually
corrected the mistake,
-
but she still received death threats.
-
Mistaken identity is more
than an inconvenience.
-
And she's not alone, in the UK,
-
the faces of over 2,400 innocent people
-
were stored by the police
department without their consent.
-
The department reported
-
a false positive identification
rate of over 90%.
-
In the U.S. there are no
reporting requirements
-
generally speaking, so
we're operating in the dark.
-
Further, these tools
do not have to identify
-
unique faces to be harmful.
-
An investigation reported
-
that IBM equipped the NYPD
-
with tools to search for people in video,
-
by facial hair and skin tone.
-
In short, these tools can be used
-
to automate racial profiling.
-
The company recently came out
-
to denounce the use of these tools
-
for mass surveillance and profiling.
-
IBM's move to stop selling
facial recognition technology
-
underscores its dangers.
-
Due to the consequences of failure,
-
I've focused my MIT research
-
on the performance of
facial analysis systems.
-
I found that for the task
of gender classification,
-
IBM, Microsoft and Amazon
-
had error rates of no more
than 1% for lighter-skinned men.
-
In the worst case,
-
these rates rose to over
30% for darker-skinned women.
-
Subsequent government
studies complement this work.
-
They show continued error
disparities in facial analysis
-
and other tasks, including
face recognition.
-
The latest government
study of 189 algorithms,
-
revealed consequential racial, gender
-
and age bias in many of
the algorithms tested.
-
Still, even if error rates improve,
-
the capacity for abuse, lack of oversight
-
and deployment limitations
pose too great a risk.
-
Given known harms, the city of Boston
-
should ban government
use of face surveillance.
-
I look forward to
answering your questions.
-
Thank you for the opportunity to testify.
-
Thank you so much
-
and to correct my horrible
mispronunciation of your name.
-
Would you mind saying your
full name for the record again?
-
Sure, my name is Joy Buolamwini.
-
Thank you very much Joy.
-
I'm gonna now turn over
to My'Kel McMillen,
-
a youth advocate.
-
I'm also gonna set the
clock for three minutes, My'Kel.
-
Good afternoon madam chair,
-
good afternoon city councilors,
-
good afternoon everybody else.
-
My name is My'Kel McMillen,
I'm a youth advocate
-
I'm also an organizer [indistinct]
which is from the east.
-
I just want to talk a little bit
-
about what's going on within
the streets of Boston.
-
For centuries black folks
-
have been the Guinea pig in America,
-
from experiments with our bodies
-
to the strain on our communities,
-
it seems [indistinct]
-
Do I agree police have evolved?
-
Yes, they are no longer
catching slaves,
-
but catching poor black and brown folks.
-
So I wanna
talk about three years ago,
-
roughly January around 2017,
-
there were residents from my
neighborhood that came up to me
-
saying they'd seen a drone
flying in the area
-
and they asked numerous times
-
and nobody could figure out
who was flying the drone,
-
who's controlling it.
-
And so upon receiving this information,
-
I live across the street
from a soda company,
-
and later on that night,
-
I seen officers playing with a drone
-
and I ended up taking the
photos of their drone.
-
And it kind of jogged my memory back
-
that this was a drone
-
that lots of residents in my neighborhood
-
had seen and talked about.
-
That no one knew who was flying the drone.
-
It came to my knowledge
it was the police.
-
So with doing some research
-
and reaching out to a friend over at the ACLU,
-
we tried to file
a public records request,
-
which was denied the first time,
-
and then eventually got it,
-
which showed the Boston police had
spent the amount of $17,500
-
on three police drones.
-
And nobody knew about it from
city council to the public.
-
It was kind of kept secret
-
and there was no committee oversight.
-
And these drones were flying illegally
-
inside of a neighborhood
in breach of privacy.
-
Not knowing what
information was being collected,
-
what was being stored,
how it was being stored
-
that really scared a lot
of folks in my neighborhood
-
and to the point that a lot
of them didn't wanna speak up
-
because of past experience
of dealing with abuse
-
and being harassed.
-
Nobody wanted that backlash,
-
but at the same time,
-
we as residents
-
have to empower one
another and have to speak up.
-
And sometimes it has to
be somebody regardless
-
if it's myself or another individual,
-
we have to hold people accountable.
-
And something that Monica
Cannon-Grant, who's another activist, says:
-
accountability has no color,
-
regardless if you're white or
black in a blue uniform,
-
you still have to be held accountable.
-
And then just to see that
no one was held accountable
-
was a scary thing,
-
because it kind of just [indistinct]
-
Nobody ever gets held accountable
-
because our voices are muffled by money
-
or just a knee in the neck,
rest in peace to George Floyd.
-
That is it for my time, thank you all.
-
Thank you, thank you so much.
-
Perfect testimony.
-
And I really appreciate you
speaking from the heart,
-
from your own experience
-
and also demonstrating how you've actually
-
had conversation for accountability
-
through your leadership.
-
So I wanna thank you for that.
-
Up next we have Karina Hem
-
from the Student Immigrant Movement.
-
Three minutes, Karina.
-
Hi everyone.
-
Thank you everyone for joining
us today at this hearing,
-
my name is Karina Hem,
-
and I'm the field organizer
-
for the Student Immigrant Movement.
-
I would like to also thank our partners
-
who've been working on the
facial recognition ban,
-
city councilors, Michelle
Wu, Ricardo Arroyo,
-
Andrea Campbell, Kim Janey,
Annissa Essaibi George,
-
along with ACLU of Massachusetts,
-
Muslim Justice League
and Unafraid Educators.
-
So SIM is a statewide
grassroots organization
-
by and for the undocumented youth.
-
We work closely with our
members, our young people,
-
to create trusting and
empowering relationships.
-
At SIM we fight for the
permanent protection
-
of our undocumented
youth and their families
-
through collective action.
-
And one of the major roles of the work
-
in the organizing that we do,
-
is to challenge these systems
that continue to exploit
-
and disenfranchise the undocumented
-
and immigrant communities.
-
So I am here on behalf of SIM
-
to support the ban on facial recognition
-
because it's another system,
-
it's another tool that will intentionally
-
be used against marginalized communities.
-
So the undocumented immigrant community
-
is not mutually exclusive
to one particular group.
-
We have transgender people,
-
black, Latina, Muslim,
people with disabilities,
-
and it's so intersectional.
-
And so the stakes of getting
misidentified are high
-
because this creates a dangerous situation
-
for people who are undocumented.
-
And these programs are
made to accurately identify
-
white adult males, leaving
black and brown folks
-
at risk on top of living
in marginalized communities.
-
And so many of the
young people I work with
-
are students who are in college
-
and many are still in high school.
-
Face surveillance technology
is not meant for children,
-
and it's not meant for the young people
-
and law enforcement already uses it
-
to monitor the youth in places
like Lockport, New York.
-
Their face surveillance system
-
has been adopted in their school district,
-
and they've already spent
over a million dollars
-
on the system.
-
So you ask yourself,
-
what exactly is the intent
of this surveillance?
-
Do youth still have the
right to their privacy
-
and have the right to live
without constant infringement
-
and interference of law enforcement?
-
If another city allows facial surveillance
-
in their schools to happen,
-
the chance of that occurring
here is high as well
-
if we do not ban facial
recognition in Boston.
-
And we believe this because the youth,
-
the undocumented youth,
-
they are already not
protected in their schools,
-
school disciplinary reports,
-
which currently have no guidelines,
protocols or clear criteria
-
on what exactly school officers can write,
-
are sent to the superintendent
or their designee
-
to sign off and send to the
Boston Police Department.
-
And still we've seen that
some of these reports
-
actually go straight to
the police department
-
with no oversight.
-
And so this endangers the
life of immigrant youth,
-
because the Boston Police
Department shares information
-
and databases with agencies like I.C.E.
-
So it creates a gateway to imprisonment
-
and deportation for any
of these young folks.
-
And so adding facial
surveillance into schools
-
and in the city would be used
against those same students
-
who are already at risk
-
for being separated from their community.
-
We know that black and brown youth
-
experience oppression every day,
-
and they already are deemed
suspicious by law enforcement
-
based on their skin color
and the clothes they wear,
-
the people they interact with and more.
-
And so adding facial recognition,
-
that's not even accurate to
surveil our youth and families
-
will just be used to justify
their arrests or deportation.
-
And through the policing system,
-
we are already asking for police officers
-
to engage with the youth in ways
-
that are just harmful to
youth development.
-
The city of Boston needs to
stop criminalizing young folks,
-
whether they engage in
criminal activity or not,
-
arresting them will not get
to the root of the problem.
-
There are so many options
-
that are much safer and
more secure for our youth,
-
for the families and their futures.
-
Banning facial recognition
would mean law enforcement
-
has one less tool to
criminalize our youth.
-
Thanks so much everyone.
-
Thank you.
-
Commissioner, I understand
time is of the essence,
-
but I'm really hoping you can
stay till about 4:15, 4:30,
-
please, please.
-
I can stay until four, that's it.
-
I have to go at four, that long.
-
Okay, so-
-
I appreciate everyone's testimony as well.
-
I have a team here taking notes.
-
So don't think when I go
that no one's taking notes.
-
We would really wanna meet again
-
and discuss this technology.
-
I know that there are some councilors
-
that have some clarifying questions
-
and we have three more people to speak.
-
And so I am begging you, please,
-
if you can at least get
to the lead sponsors,
-
councilor Wu and councilor Arroyo.
-
That'll be three, six, nine, 10.
-
That'd be probably 20 more minutes.
-
A little after four.
-
Okay.
A little close for me.
-
[Gross laughs]
-
We're trying, we're trying.
-
I understand, I understand, go ahead.
-
Thank you, Kade.
-
Thank you chair.
-
My name is Kade Crawford.
-
I am the director of the
Technology for Liberty Program
-
at the ACLU of Massachusetts.
-
Chair Edwards, members of the committee
-
and members of the city council,
-
thank you all for the opportunity
-
to speak on this crucial issue today.
-
We are obviously in the midst
of a massive social upheaval
-
as people across Boston and the nation
-
demand that we ensure black lives matter
-
by shifting our budget
priorities away from policing
-
and incarceration and
towards government programs
-
that lift people up instead
of holding them back.
-
And people are outraged
because for too long
-
police departments have
been armed to the teeth
-
and equipped with military
style surveillance technologies,
-
like the ones that My'Kel just referenced.
-
Even as our schools lack
the most basic needs,
-
like functioning water
fountains, nurses and counselors.
-
And even now during this pandemic,
-
as our healthcare providers
lack the PPE they need
-
to protect themselves and us,
-
but our government cannot continue to act
-
as if it is at war with its residents,
-
waging counter-insurgency surveillance
-
and control operations,
-
particularly in black
and brown neighborhoods
-
and against black and brown people.
-
We must act, we must not merely
speak, to change our laws,
-
to ensure we are building a free,
-
just and equitable future
for all people in Boston.
-
So that said,
-
banning face surveillance
in Boston is a no-brainer.
-
It is one small but vital piece
-
of this larger necessary transformation.
-
Face surveillance is
dangerous when it works
-
and when it doesn't.
-
Banning this technology in Boston now
-
is not an academic issue.
-
Indeed the city already uses technology
-
that with one mere software upgrade
-
could blanket Boston in the
kind of dystopian surveillance
-
currently practiced by
authoritarian regimes
-
in China and Russia.
-
And as Joy said,
-
even some cities right
here in the United States.
-
Boston has to strike a different path
-
by protecting privacy, racial justice,
-
and first amendment rights.
-
We must ban this technology now,
-
before it creeps into our
government, in the shadows,
-
with no democratic control or oversight
-
as it has in countless other cities,
-
across the country and across the world.
-
So today you already
heard from pretty much
-
the world renowned expert
-
on racial and gender bias
in facial recognition.
-
Thank you so much Joy
for being with us today,
-
your work blazed a trail
and showed the world
-
that yes, algorithms
can in fact be racist.
-
You also heard from My'Kel McMillen,
-
a young Boston resident
who has experienced firsthand,
-
what draconian surveillance
-
with no accountability looks like.
-
Karina Hem from the
Student Immigrant Movement
-
spoke to the need to
keep face surveillance
-
out of our public schools,
-
where students deserve to be
able to learn without fear.
-
You're gonna hear a
similar set of sentiments
-
from Erik Berg of the
Boston Teachers Union
-
and shortly medical doctor
and infectious disease expert,
-
Joshua Barocas will testify
-
to how this technology in Boston,
-
would inevitably undermine trust
-
among the most vulnerable people
-
seeking medical care and help.
-
Obviously that's the last
thing that we need to do,
-
especially during a pandemic.
-
For too long in Boston,
city agencies have acquired
-
and deployed invasive
surveillance technologies
-
with no public debate, transparency,
-
accountability or oversight.
-
In most cases, these technologies,
-
which as My'Kel said, include drones,
-
but also license plate readers,
-
social media surveillance,
video analytics software
-
and military style cell
phone spying devices,
-
among many others.
-
These were purchased and deployed
-
without the city council's
knowledge, let alone approval.
-
We have seen this routine play out
-
over and over and over for years.
-
This is a problem with all
surveillance technologies,
-
but it is especially unacceptable
-
with a technology as dangerous
-
and dystopian as face surveillance.
-
It is not an exaggeration to say that
-
face surveillance would,
-
if we allow it, destroy privacy
-
and anonymity in public space.
-
Face surveillance technology
-
paired with the thousands of
networked surveillance cameras
-
already installed throughout the city,
-
would enable any official
with access to the system
-
to automatically catalog the movements,
-
habits and associations of
all people at all times,
-
merely with the push of a button.
-
This is technology that
would make it trivial
-
for the government to
blackmail public officials
-
for seeking substance use treatment,
-
or to identify whistleblowers
-
who are speaking to the Boston Globe
-
or in its most routine manifestation
-
to supercharge existing police harassment
-
and surveillance of
black and brown residents
-
merely because of the color of their skin
-
and where they live.
-
Using face recognition tech,
-
the police could very easily
take photos or videos of people
-
at any of these Black Lives
Matter demonstrations,
-
run those images through
a computer program
-
and automatically populate a list
-
of each person who attended
-
to express their first amendment rights
-
to demand racial justice.
-
The police could also
ask a computer program
-
to automatically populate
a list of every person
-
who walked down a specific street
on any given morning
-
and share that information with I.C.E.
-
It could also be used
-
to automatically alert law enforcement,
-
whenever a specific person passes
-
by a specific surveillance
camera anywhere in the city.
-
And again, there are
thousands of these cameras
-
with more going up each week.
-
Yes, so we're, I'm so sorry.
-
You're over the three minutes.
-
Well, over the three minutes,
-
but I'm I...
-
and I have two more people
plus trying to get through this.
-
So I'm just letting you
know if you could summarize.
-
All right, I'm almost done, thank you.
-
Sorry, chair.
-
So I'll just skip to the end and say,
-
we're at a fork in the road right now,
-
as Joy said, you know, major corporations,
-
including IBM and Google are now declining
-
to sell this technology
and that's for good reason,
-
because of the real
potential for grave human
-
and civil rights abuses.
-
We the people, if we live in a democracy
-
need to be in the driver's seat
-
in terms of determining
-
whether we will use these technologies.
-
We can either continue
with business as usual,
-
allowing governments to adopt
-
and deploy these technologies
unchecked in our communities,
-
our streets, in our schools,
-
or we can take bold
action now to press pause
-
on the government's
use of this technology,
-
to protect our privacy
-
and to build a safer, fairer,
freer future for all of us.
-
So please, I encourage the city council
-
to join us by supporting
this crucial ordinance
-
with your advocacy and with your vote
-
and I thank you all for
your public service.
-
Thank you very much,
Erik Berg, three minutes.
-
Yeah, thank you.
-
And I'll get right to the
point in the interest of time.
-
Thank you to the council
-
for taking this up and for
allowing this time today.
-
And I'm here to speak today
-
on behalf of the Boston Teachers Union
-
and our over 10,000 members
in support of the ordinance,
-
banning facial recognition
technology in Boston
-
that was presented by
councilors Wu and Arroyo.
-
We strongly oppose the
use of this technology
-
in our public schools.
-
Boston public schools
should be safe environments
-
for students to learn,
-
explore their identities
and intellect and play.
-
Face surveillance technology
threatens that environment.
-
The technology also threatens
the rights of our BTU members,
-
who must be able to go to work
-
without fearing that their every movement,
-
habit and association will
be tracked and cataloged.
-
To our knowledge,
-
this technology is not
in use in our schools,
-
but we've already witnessed
some experimenting with it.
-
Although it was unclear
if it was authorized.
-
Two summers ago, members of
the Boston Teachers Union,
-
who were working in the summer program,
-
contacted the union to let us know
-
that they were being asked to sign in
-
using an app called Tinder
with face recognition features.
-
And the central office
-
didn't seem to know about this program,
-
and to this day,
-
we don't know how it came about.
-
While the district quickly stopped
-
using the photo portion of this app
-
and informed the union that
all photos have been deleted,
-
this incident is indicative
-
of how easy it is for private
security or HR companies
-
to sell a technology to a
well-intentioned principal
-
or superintendent who
may not have expertise
-
in the tech field.
-
Face surveillance in schools
transforms all students
-
and their family members,
as well as employees
-
into perpetual suspects,
-
where each and every
one of their movements
-
can be automatically monitored.
-
The use of this technology
in public schools
-
will negatively impact students' ability
-
to explore new ideas,
express their creativity,
-
and engage in student dissent
-
an especially disturbing prospect
-
given the current youth led protests
-
against police violence.
-
Even worse, as we've heard,
-
the technology is frequently
biased and inaccurate,
-
which raises concerns
-
about its use to police students of color.
-
Academic peer-reviewed studies
-
show face surveillance algorithms
-
are too often racially biased,
-
particularly against black women
-
with inaccuracy rates up to
35% for that demographic.
-
We know black and brown students
-
are more likely to be punished
for perceived misbehavior.
-
Face surveillance will only perpetuate
-
and reproduce this situation.
-
When used to monitor children,
-
this technology fails
in an essential sense
-
because it has difficulty
-
accurately identifying young
people as their faces change.
-
Research that tested five top performing
-
commercial off the shelf
face recognition systems
-
shows there's a negative bias
when they're used on children:
-
they perform more poorly on
children than on adults.
-
That's because these systems are modeled
-
through the use of adult faces
-
and children look different from adults
-
in such a way that they cannot be considered
-
simply scaled-down versions.
-
On top of this, face
-
surveillance technology regularly
-
misgenders transgender people,
-
and will have a harmful impact
-
on transgender young
people in our schools.
-
Research shows that
automatic gender recognition
-
consistently views gender
in a trans exclusive way.
-
And consequently carries
disproportionate risk
-
for trans people subject to it.
-
At a time when transgender children
-
are being stripped of their
rights at a national level,
-
Boston must protect transgender
kids in our schools.
-
Moreover, face surveillance in schools
-
will contribute to the
school to prison pipeline,
-
threatening children's welfare,
-
educational opportunities
and life trajectories.
-
Already, children from
marginalized communities
-
are too often funneled
out of public schools
-
and into the juvenile and
criminal justice systems,
-
and face surveillance will inevitably feed this pipeline.
-
I'll get right to the end.
-
Finally, face surveillance technology
-
will harm immigrant families.
-
In this political climate,
-
immigrants are already
fearful of engagement
-
with public institutions
-
and face surveillance systems
-
would further chill student
and parent participation
-
in immigrant communities in our schools,
-
Boston schools must be
welcoming and safe spaces
-
for all families.
-
The city of Boston must take action now
-
to ensure children and BTU workers
-
are not subject to this unfair,
-
biased and chilling scrutiny.
-
In order to protect young people
-
in our educational community,
-
we must stop face surveillance
in schools before it begins.
-
Thank you very much for your attention
-
and your consideration.
-
Thank you very much.
-
We're getting there,
-
very, very close, commissioner, I promise.
-
Just one more person.
-
Then the two people who specifically said
-
they had some questions,
we'll go right to them.
-
Joshua Barocas
-
Yeah, I will read quickly.
-
First of all, thanks for allowing me
-
to testify today regarding
this important issue.
-
I should say that the views
-
that I'm expressing today are my own
-
and don't necessarily
represent those of my employer,
-
but that said I'm an
infectious disease physician
-
and an addictions
researcher here in Boston.
-
As such, I spend a large
majority of my time working
-
on solutions to improve
the health and wellbeing
-
of vulnerable populations,
-
including people who are
experiencing homelessness,
-
and people with substance use disorders.
-
One thing that we know
is that stigma and bias
-
are pervasive throughout our community
-
and lead to disparities in
care for these populations.
-
These disparities were laid bare
-
and exacerbated by the
ongoing COVID-19 pandemic.
-
While we're all searching for answers
-
on how to best protect
ourselves from this virus,
-
I truly fear that the use of
facial recognition technology
-
will only serve to exacerbate
existing disparities,
-
including racial, gender
and socioeconomic disparities.
-
If we use this very nascent
and imperfect technology
-
to track individuals, I'm concerned
-
that it will only serve
to marginalize people
-
and worsen health outcomes.
-
We've heard about the low accuracy rates.
-
We know about systemic racism
-
as a public health issue
-
that's fueled by stigma,
bias and misinformation.
-
Additionally, Boston
provides expansive services
-
for substance use disorders,
including methadone treatment,
-
and it's imperative that we
ban facial recognition software
-
to ensure that people seeking
-
substance use disorder
treatment and mental health care
-
among other stigmatized health
services may do so privately
-
and without fear.
-
If we allow this on many
of our public cameras,
-
it would enable the government
-
to automatically compile lists of people
-
seeking treatment for
substance use, mental health
-
and as I said, other
stigmatized health conditions.
-
This would have a chilling effect
-
and disproportionately affect
-
our most vulnerable populations.
-
I'll stop there and ask that
we consider banning this.
-
Thank you so very much.
-
I'm gonna turn it over.
-
I misspoke earlier, the lead
sponsor there is Michelle Wu,
-
and she mentioned that she has
to move along very quickly.
-
So I wanted to, if it's
okay, councilor Arroyo,
-
I'm gonna go ahead and go
to councilor Wu real quick,
-
for specific questions.
-
You're good!
Yeah, absolutely.
-
Councilor Wu.
-
Thank you madam chair.
-
And thank you so much commissioner
-
for making the time to be here.
-
We all know how many demands
there are on your time.
-
And now we're asking for even
more than you had allotted.
-
So very, very grateful, it means a lot
-
that you took the time and
are the one at this hearing.
-
I also wanna note that you
were the one at the hearing
-
in June, 2018 as well.
-
So we know that you've been
part of this conversation
-
and supportive of this
general idea for a long time.
-
So just to clarify on
some of what you said,
-
you mentioned that you,
-
you referenced the technology upgrades,
-
that some current vendors
that BPD works with
-
have for face surveillance.
-
So just to clarify,
-
has BPD upgraded to the next
version of that software
-
and we're just choosing
-
not to use the facial
recognition components of it,
-
or has the upgrade not been accepted
-
because it includes facial recognition?
-
To my knowledge, we have not upgraded,
-
but we anticipate that we will have to,
-
but we will not be using any components
-
of facial recognition.
-
And I guess the technology
just isn't efficient enough -
-
When do you think the
upgrade would likely happen?
-
No, I will get back to you.
-
And trust me, I wanna
have a further meeting
-
so I can have my subject
matter experts here.
-
And I'm gonna tick through,
-
'cause I wanna try to get you out of here
-
as quick as possible.
-
Secondly, so you drew a distinction
-
between face surveillance systems
-
and facial recognition technology.
-
So just to clarify,
-
does BPD, we don't use that.
-
You don't own it right now
-
because you haven't done that upgrade,
-
but does BPD work with other
state or federal entities
-
that have face surveillance
-
or facial recognition technology?
-
Do you ever use the
results of that technology
-
on any sort of databases?
-
To answer you more accurately,
-
I would have to find out.
-
I believe the state police
-
does have some form of facial recognition
-
and maybe the registry too.
-
Okay, the registry of motor vehicles.
-
Yes.
-
And so when state police or
-
we'll find out more potentially BPD
-
works with the RMV to
run matches of photos
-
against their database.
-
Do you need any sort of warrants
-
or other approvals to do that beforehand,
-
for let's say the state?
-
So I used to be part of
the Bureau of Investigative
-
Services, to my knowledge,
-
it's only utilized to help
assist in photo arrays.
-
So nothing past that.
-
Okay, could you just clarify
what you mean by that?
-
So if you're a victim of a crime
-
and we don't have that
person under arrest,
-
but we do have a potential suspect?
-
You have to pick,
-
you are allowed the opportunity
-
to try to identify the
suspect of the crime.
-
And so to be fair,
-
to ensure that no innocent
person is selected,
-
you put the suspect's picture
-
along with several other suspects' pictures
-
up to seven or more,
-
and then you see if
the victim of the crime
-
can pick the suspect
out of that photo array.
-
Okay, and so the RMV
uses the face recognition
-
to provide those photos?
-
I wanna give you an accurate answer.
-
So again, in our next meeting,
you'll have your answer.
-
Thank you.
-
Are you aware of any state
or federal regulations
-
that provide any sort
of restrictions
-
or reining-in guidance when it comes
-
to facial recognition technology
-
or face surveillance systems?
-
Nope.
-
Okay, and so given the absence
of guidelines right now
-
at the federal, state or city level
-
and the need to upgrade and
the use by state police,
-
for example, are you supportive,
-
we're hearing you loud and clear
-
that you'd like to have
further conversation,
-
we just wanna get your opinion now,
-
are you supportive of a city level ban
-
until there are policies
in place potentially,
-
at other levels or
specifically at the city level
-
to rein in surveillance overall?
-
Yes, I am.
-
I've been clear for four years.
-
We need your input, your guidance.
-
And I thank you to everyone
that testified today.
-
I have a team taking notes
-
because I didn't forget
that I'm African-American
-
and I could be misidentified as well.
-
And I believe that we have
one of the most diversified
-
populations in the history of our city.
-
And we do have to be fair to everyone
-
that is a citizen of Boston.
-
We don't want anyone misidentified.
-
And again we will work with everyone here
-
because this is how we will be educated.
-
Yes, we're very grateful for your time.
-
I'm gonna cede to the co-sponsor.
-
Thank you.
-
So, actually, is that fine chair?
-
I just wanna thank you commissioner
-
for supporting the facial
recognition surveillance ban
-
and for taking the time to
be here for the questions.
-
Councilor Wu asked many of them,
-
I have one specific one which is,
-
has the BPD used facial
recognition in the past,
-
in any capacity?
-
No, not to my knowledge.
-
And I will review that, but
not to my knowledge at all.
-
And BRIC, does BRIC have access
-
to facial recognition techniques?
-
Nope, not at all.
-
We don't use it and I
anticipated that question
-
and we make sure that
we're not a part of any,
-
BRIC does not have facial recognition.
-
Okay, I think every other question
-
that I had for you today
was asked by councilor Wu.
-
So I just thank you for
taking the time to be here
-
and for supporting this, thank you.
-
Thank you, and before I go,
-
for the record, nothing
has changed in my opinion
-
from four years ago,
-
until this technology
is a hundred percent,
-
I'm not interested in it.
-
And even when it is a hundred percent,
-
you've committed to a
conversation about it?
-
Absolutely, we discussed that before.
-
That you have.
-
It's not something that
you can just implement.
-
[crosstalk]
-
It's important that we also get
the input from the community
-
that we serve and work in partnership
-
so we can of course have
a better quality of life.
-
And that's inclusive of everyone's
expectation of privacy.
-
So before you go,
-
I'm gonna do this one call-out to
the panelists that just spoke
-
or to any city councilors.
-
One question that you may
have for the commissioner
-
that you need to ask,
do any of you have that?
-
Do any of you, Joy, My'Kel,
Karina, Kade, Erik, Joshua,
-
or the other city councilors?
-
I'm trying to be mindful of his time
-
but I also understand
-
you guys also gave time
to be here today as well.
-
And if you wanted to ask him
representing BPD a question.
-
I see Julia Mejia has raised her hand,
-
to the panelists though,
-
who just spoke, any questions?
-
I'm gonna go now to my
colleagues to ask one question.
-
So that allows for him to go.
-
Okay, councilor Mejia
-
Yes, I just wanted to quickly
thank commissioner Gross
-
for your time and just
your steadfast leadership.
-
But I do have just a quick
question before you go.
-
I just need some clarity,
-
it's that you mentioned that
you hope to keep this hearing
-
on the topic of the ordinance
-
and find other ways to address
issues of police relations.
-
So a lot of people draw the direct link
-
between facial recognition
-
and relationships with the community.
-
Do you see that link?
-
And can you talk about
how the police department
-
sees that link?
-
I'm just really curious about
-
how do we build relationships
with the community
-
while also thinking
about facial recognition?
-
Like, how do you reconcile that?
-
I can answer that,
-
the information I was given before
-
was that we were going to
discuss many subjects.
-
And so I thank you for
illuminating this one,
-
but the only way you
increase relationships
-
with the community is to
-
hold hard discussions like this one.
-
You have to listen to the
people that you serve,
-
because everyone throws around
-
the moniker community policing.
-
I believe that's become jaded.
-
And so I've created a bureau
of community engagement
-
so that we can have these discussions.
-
And again, I don't forget my history.
-
I haven't forgotten my history.
-
I came on in 1983 and I've
gone through a lot of racism,
-
was in a tough neighborhood growing up,
-
and I didn't forget any of that.
-
And as you alluded to earlier,
some of the city councilors,
-
I'm always in the street
talking to people,
-
I get a lot of criticism of
law enforcement in general,
-
but I believe in if you
want change, be the change,
-
you can only change with the people.
-
So, I am looking forward
to further conversation
-
and to listening to testimony
on how we can improve
-
our community relations,
-
especially not only through
the time of civil unrest,
-
but we're in the middle
of a pandemic as well.
-
And so police departments
-
should not be separated
from the community.
-
You have to have empathy,
sympathy, care and respect.
-
A lot of people have lost jobs,
-
and we must keep in mind
about socioeconomics,
-
fairness for employment.
-
A lot of those things
-
directly affect the
neighborhoods of color.
-
And I'm glad that we are increasing
-
our diversity in the communities.
-
And we currently
have organizations in Boston
-
that directly speak to the people.
-
The Latino Law Enforcement Organization,
-
the Cabo Verde Police Associations,
-
the Benevolent Asian Jade Society.
-
Our department is increasing in diversity,
-
and the benefit of that,
-
the benefit of having representation
-
from every neighborhood we serve,
-
is that it will improve relationships.
-
So again, that's why I'm looking forward
-
to a working session and
my team is taking notes
-
because everybody's
testimony is important.
-
Trust me, it's people like you that worked hard
-
for everyone's rights.
-
I wouldn't be here
-
as the first African-American
police commissioner,
-
if we didn't have folks
-
that exercised their
First Amendment rights.
-
Thank you, commissioner Gross,
-
I have a clarifying question.
-
And then I just wanted to note,
-
this impacts the staying in.
-
So for folks who,
commissioner Gross has alluded
-
to a working session,
-
just to give some clarification
for those watching,
-
that is when we actually get down
-
to the language of the ordinance
-
and work down to the
commas, as I say,
-
and how to do that.
-
And so what commissioner Gross is asking
-
that we have that conversation
within the next 60 days,
-
which I'm fine with doing that,
-
I'll just check with the lead sponsors.
-
I did wanna make sure I understood
-
what, in terms of timing, is
going on with the contract,
-
is BPD entering into a contract
that has this technology?
-
There was a timing issue
-
that I thought was warranting this hearing
-
happening very fast.
-
Maybe the lead sponsors could
also answer this question.
-
I was confused,
-
if someone could explain what was the...
-
there's a sense of urgency
that I felt that I was meeting
-
and I'm happy to meet it.
-
Just if someone either, I
saw you Kade, you nodded.
-
Let me just figure out what
this contract issue is, Kade.
-
Sure councilor, I'd be
happy to address that.
-
We obtained public records
from the city a while back
-
showing that the Boston Police Department
-
has a contract with a
company called BriefCam
-
that expired on May 14th.
-
And the contract was for version
4.3 of BriefCam software,
-
which did not include facial
surveillance algorithms.
-
The current version of
BriefCam's technology
-
does include facial
surveillance algorithms.
-
And so one of the reasons
that the councilors
-
and the advocacy groups
behind this measure
-
were urging you chair
to schedule this quickly
-
is so that we could pass this ban fast
-
to ensure that the city
-
does not enter into a contract
to upgrade that technology.
-
Thank you, commissioner Gross,
-
where are we on this contract?
-
Yep and thank you.
-
That's exactly what I was gonna comment.
-
The older version of BriefCam
-
did not have facial recognition
-
and I believe that the newer version does.
-
BriefCam, on the old version,
-
works on condensing objects, cars,
-
but I would definitely
check on that contract
-
because I don't want any technology
-
that has facial recognition
-
and that's exactly what
we were gonna check on
-
and so my answer right
now is no at this point.
-
As I've just read into testimony,
-
if we obtain technology such as that,
-
I should be speaking to all of you
-
and to what that entails,
-
if we have BriefCam and
it has facial recognition
-
[indistinct] that allows
for facial recognition, no.
-
And if someone did, if we
did go into a contract,
-
would I have the ability to show you
-
that that portion that does
have facial recognition,
-
that that can be censored?
-
That that is not, excuse
me, poor choice of words
-
that that can be excluded.
-
And so I'm not comfortable
with that contract
-
until I know more about it.
-
I don't want any part
of facial recognition.
-
But as I read into testimony,
-
all the technology
that's going forward now
-
in many fields is like,
-
hey we have facial recognition
-
and I'm like not comfortable with that.
-
And so BriefCam, yeah the old version,
-
we hardly ever used it.
-
And there is discussion on the new version
-
and I'm not comfortable
-
with the facial recognition
component of that.
-
Thank you very much.
-
And thank you very much,
Kade, for the background,
-
and understanding, both of you.
-
I felt a sense to move
fast and I'm moving fast,
-
but I was trying to make
sure I understood what for.
-
I really do have to go.
-
Yes, thank you very much.
-
Thank you all for your input.
-
Will someone be taking
notes, or is there someone,
-
I guess Neil from the mayor's
office will be taking notes.
-
And I have my own team
taking notes right now.
-
Excellent, thank you very much.
-
Thank you for your testimony.
-
If it weren't for
advocates, such as everyone
-
on this Zoom call, I wouldn't
be here in this capacity.
-
So thank you.
-
Thank you very much commissioner.
-
Thank you everyone take care.
-
Okay, thank you very much.
-
And so we still have a
robust conversation to have
-
amongst ourselves about the
language, about the goals,
-
whether it goes far enough,
if anyone opposes this ban,
-
and I want people to understand
-
if there is someone who
disagrees with a lot of us,
-
this will be a respectful conversation
-
and that person will be
welcome to have their opinion
-
and express it as every single one of us
-
has been able to do.
-
So I'm gonna continue now,
-
there's a list of folks
who have signed up to speak
-
and I'm gonna continue down that list
-
and keep them to no
more than three minutes
-
and then we're gonna open up.
-
Oh, I'm so sorry, I apologize.
-
Before I go down a list,
I know my councilor,
-
my colleagues also may
have questions or concerns
-
or may wanna voice certain
things about the legislation.
-
And I apologize.
-
[Lydia giggles]
-
So I'm just so amped to get
to the public testimony.
-
So I'm gonna go ahead and
go in order of arrival,
-
the two lead sponsors have
asked questions already
-
of the commissioner.
-
Did either councilor
Wu or councilor Arroyo
-
have any questions for the
panelists that just spoke?
-
If not, I'll move on
to my other colleagues.
-
MICHELLE: I'm happy to
defer to colleagues,
-
thank you madam chair.
-
Very well, councilor Arroyo.
-
The only question I would have,
-
which is more to, I
think Kade from the ACLU,
-
is whether or not she has any idea
-
where we are on that contract.
-
From what I heard from commissioner Gross,
-
it sounded like he couldn't confirm
-
the timeline for that contract,
-
whether or not it's been
signed or not signed,
-
what's going on with that contract.
-
Do any panelists here
have any information
-
on whether or not that contract is,
-
what the status of that is?
-
Thanks for the question, councilor Arroyo.
-
Regrettably no.
-
We filed a public records request
-
with the Boston Police
Department on May 14th,
-
so that was about a month ago,
-
asking just for one simple document
-
for the existing contract
-
that the Boston Police
Department has with BriefCam
-
and they haven't sent it to us yet.
-
So, and we've prodded them
multiple times about it,
-
including on Friday saying,
-
it would be a real shame
-
if we had to go to this hearing on Tuesday
-
without this information and
we still haven't received it,
-
so we do not.
-
Thank you.
-
That all, councilor Arroyo?
-
Very well, councilor Breadon.
-
Thank you.
-
This has been very informative,
-
very good questions from my colleagues.
-
One question I had was, other
agencies, federal agencies,
-
are they using facial recognition?
-
And if they are, are they
sharing that information
-
with our police department?
-
Does anyone know the
answer to that question?
-
I can speak to that, Joy
Buolamwini, if you're still on,
-
you may wanna speak to some of those too.
-
So the ACLU nationwide has
been filing FOIA requests,
-
Freedom of Information Act Requests
-
with various federal agencies
-
to learn about how the federal government
-
is using this technology
and how they're sharing
-
information derived from the technology,
-
with state and local law enforcement.
-
We know for example, that
in the city of Boston,
-
the Boston Police Department works closely
-
not only with I.C.E,
-
which has been the subject
of much consternation
-
and debate before this body,
-
but also with the FBI,
-
through something called the
Joint Terrorism Task Force.
-
The Boston Police Department
-
actually has detectives
assigned to that JTTF unit,
-
who act as federal agents.
-
So one concern that we have is that
-
even if the city of Boston banned
-
face surveillance technology
for city employees,
-
certainly it would be
the case that the FBI
-
and other federal agencies
-
would be able to continue
to use this technology
-
and likely to share
information that comes from it
-
with the Boston Police Department
and other city agencies.
-
Unfortunately, there's
nothing we can do about that
-
at the city level,
-
which is why the ACLU
-
is also working with partners in congress
-
to try to address this
at the federal level as well.
-
Thank you, that answered my question
-
and thank you for your work
on this very important issue.
-
Councilor, madam chair
that's all the questions
-
I have for now.
-
Thank you very much, councilor Bok.
-
So, Mejia.
-
I'm so sorry, councilor Bok.
-
Thank you councilor Breadon.
-
Next is councilor Bok, I apologize.
-
Thank you, thanks councilor Edwards.
-
My question was actually
just, it's for the panelists.
-
Whoever wants to jump in,
-
the police commissioner
was making a distinction
-
between facial surveillance
and facial recognition systems
-
and what we might be banning or not.
-
And I'm just wondering,
-
I don't know the literature
in this world well enough
-
to know if that's sort of a
strong existing distinction,
-
whether you think this piece
of legislation in front of us
-
bans one or the other,
-
whether you think we should
be making that distinction,
-
I'd just be curious for anybody
to weigh in on that front.
-
Got it, so when we hear the
term facial recognition,
-
oftentimes what it
means is not so clear.
-
And so with the way the
current bill is written,
-
it actually covers a wide range
-
of different kinds of facial
recognition technologies.
-
And I say technologies plural
-
to emphasize we're talking
about different things.
-
So for example, you have face recognition,
-
which is about identifying
a unique individual.
-
So when we're talking
about face surveillance,
-
that is the issue.
-
But you can also have facial analysis,
-
that's guessing somebody's gender
-
or somebody's age or other systems
-
that might try to infer your
sexuality or your religion.
-
So you can still discriminate
-
even if it's not
face recognition
-
in the technical sense.
-
So, it's important to have
a really broad definition.
-
You also have face detection.
-
So if you think about weapon systems
-
where you're just detecting
the presence of a face
-
without saying it's a specific individual,
-
that can still be problematic.
-
You also have companies like Faception
-
that say, we can infer criminality
-
just by looking at your face,
-
your potential to be a pedophile,
a murderer, a terrorist.
-
And these are the kinds of systems
-
that companies are attempting
to sell law enforcement.
-
So it's crucial that any legislation
-
that is being written,
-
has a sufficiently broad definition
-
of a wide range of facial
recognition technologies, right?
-
With the plural so that
you don't get a loophole,
-
which says, oh, we're
doing face identification
-
or face verification, which
falls under recognition.
-
So everything we're doing
over here doesn't matter,
-
but it does.
-
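[To make that distinction concrete: a minimal sketch in Python, using made-up number-list "embeddings" in place of a real face model. None of these function names are a real vendor API; they only illustrate why "facial recognition" covers several distinct tasks, and why a narrow definition leaves loopholes. Face detection, "is there a face here at all," is a fourth, identity-free task not shown.]

```python
# A toy sketch of the different tasks lumped under "facial recognition."
# The embeddings and thresholds are invented for illustration only.
from math import dist

def verify(embedding_a, embedding_b, threshold=0.5):
    # Face VERIFICATION: one-to-one -- are these two faces the same person?
    return dist(embedding_a, embedding_b) < threshold

def identify(probe, gallery, threshold=0.5):
    # Face IDENTIFICATION: one-to-many -- who in a database is this face?
    # Run continuously on public cameras, this becomes face surveillance.
    name, embedding = min(gallery.items(), key=lambda kv: dist(probe, kv[1]))
    return name if dist(probe, embedding) < threshold else None

def analyze(embedding):
    # Facial ANALYSIS: guessing attributes (age, gender, ...) with no
    # identity claim at all -- yet still capable of discriminating.
    return {"guessed_age_bracket": "25-35"}  # placeholder output

gallery = {"person_a": [0.1, 0.9], "person_b": [0.8, 0.2]}
probe = [0.12, 0.88]
print(verify(probe, gallery["person_a"]))  # one-to-one check
print(identify(probe, gallery))            # database search
print(analyze(probe))                      # attribute guess, no identity
```
-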
So thank you so much for that question
-
because this is an
important clarification.
-
Great, thanks so much.
-
Was there anybody else who
wanted to comment on that front?
-
No, okay, great.
-
Thank you, that was really helpful Joy.
-
Madam chair, I'm mindful
of the public testimony,
-
so that'll be it for me.
-
Thank you very much councilor Bok,
-
councilor Mejia.
-
Thank you again to the panelists
-
for educating us beforehand
-
and always sharing all of
this amazing information
-
and data and research that
informs our thinking every day.
-
I am just curious
-
and at the same time a little bit worried
-
about, just kind of like,
-
how we can prevent this
from ever passing ever.
-
Because it seems to me that
-
the way it's been positioned
is that it's right for now,
-
because it's not accurate,
-
but I'm just curious what happened
-
if you've seen other cities
or just even around the world
-
or any other incidences
-
where this has kind of slipped
under the radar and has passed.
-
I don't know if I'm making
any sense here but...
-
Basically the bottom line is
what I'm trying to understand
-
is that here we are standing firm
-
and banning facial recognition,
-
from what I understand at this point,
-
our commissioner is in agreement,
-
because of the issue of accuracy, right?
-
How can we get ahead of this situation
-
in a way that we can write this ordinance
-
to ensure that regardless of
whether or not it is accurate,
-
that we can still protect our residents.
-
Do you get what I'm trying to say?
-
I do councilor Mejia
and thank you for that.
-
I can address that quickly.
-
The council can't bind
future city councils, right?
-
So I agree with you.
-
I think that this technology is dangerous,
-
whether it works or it doesn't.
-
And I think that's why the city of Boston
-
ought to take the step to
ban its use in government.
-
I can promise you that as advocates,
-
we will show up to ensure
-
that these protections
persist in the city of Boston,
-
as long as I'm alive. [Kade laughs]
-
And I think that's true of many
of my friends and colleagues
-
in the advocacy space.
-
But some cities, some states
-
have considered approaches of
-
for example passing a moratorium
-
that expires after a few years,
-
we did not choose that route here,
-
in fact because of our concerns
-
that I think are identical to yours.
-
It's our view that we should
never wanna live in a society
-
where the government can track us
-
through these surveillance cameras
-
by our face wherever we go
or automatically get alerts,
-
just because I happened
to walk past the camera
-
in a certain neighborhood.
-
That's a kind of surveillance
-
that should never exist in a free society.
-
So we agree that it should
be permanently banned.
-
Thank you for that.
-
And then the other
question that I have is,
-
I know that, with the city council,
-
all of the jurisdiction is all things
-
that deal with the city,
-
but I'm just wondering
what if any examples
-
have you heard of other,
-
like maybe within the private sector
-
that are utilizing or considering
utilizing this in some form?
-
Is there anything, any
traction or anything
-
that we need to be mindful of?
-
Happening outside of city
government, if you will.
-
Joy, do you wanna speak
-
to some of the commercial applications?
-
Sure, I'm so glad you're
asking this question
-
because facial recognition technologies
-
broadly speaking are not
just in the government realm.
-
So right now, especially
with the COVID pandemic,
-
you're starting to see
more people make proposals
-
of using facial recognition
technologies in different ways.
-
Some that I've seen start to surface are,
-
can we use facial recognition
for contactless payments?
-
So your face becomes what you pay with.
-
You also have the case
-
of using facial analysis in employment.
-
So there's a company HireVue
-
that says we will analyze
your facial movements
-
and use that to inform hiring decisions.
-
And guess what, we train on
the current top performers.
-
So the biases that are already
there can be propagated
-
and you actually might
purchase that technology
-
thinking you're trying to remove bias.
-
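[A toy sketch, on invented data, of the feedback loop being described: a model "trained" on ratings of current top performers inherits whatever bias produced those ratings, then reproduces it at hiring time.]

```python
# Invented past-employee records: ratings historically favored school X,
# independent of actual ability.
past_employees = [
    # (from_school_x, rated_top_performer)
    (True, True), (True, True), (True, False),
    (False, False), (False, False), (False, True),
]

def predicted_score(from_school_x):
    # "Training": estimate P(top performer) per group from the biased labels.
    rows = [top for school_x, top in past_employees if school_x == from_school_x]
    return sum(rows) / len(rows)

# The old favoritism is now an automated, seemingly "objective" score.
print(predicted_score(True), predicted_score(False))  # ~0.67 vs ~0.33
```
-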
And so there are many ways
in which these technologies
-
might be presented with good intent,
-
but when you look at the ways in which
-
it actually manifests, there are harms
-
and oftentimes harms that
disproportionately fall
-
on a marginalized group.
-
Even in the healthcare system.
-
I was reading research earlier
-
that was talking about ableism
-
when it comes to using some kinds of
-
facial analysis systems
-
within the healthcare context,
-
where it's not working as well
-
for older adults with dementia.
-
So the promises of these technologies
-
really have to be backed
up with evidence.
-
And I think the council can
actually go further by saying,
-
we're not only just talking about
-
government use of facial
recognition technologies,
-
but if these technologies
-
impact someone's life in
a material way, right?
-
So we're talking about
economic opportunity.
-
We're talking about healthcare,
-
that there needs to be oversight
-
or clear restrictions depending
on the type of application,
-
but the precautionary
principle, press pause,
-
makes sense when this technology
is nowhere near there.
-
Thank you.
-
For that and I'm not
sure if I got the Jabil,
-
but I do have one more,
-
You got the, you got the...
-
[crosstalk]
-
But I'll wait.
-
Sorry, thank you so much
to you both, thank you.
-
Councilor Flaherty.
-
Okay, councilor Campbell said
-
she was having some spotty
issues with her internet.
-
So if she's not here,
-
I'm gonna go ahead and
go to councilor O'Malley
-
who I'm not sure he's still on.
-
I'm here madam chair.
-
There you are, councilor O'Malley.
-
Thank you.
-
Yeah, I will be brief because
I think it's important
-
to get to the public testimony,
-
but it seems to me that this sounds
-
like a very productive
government operations hearing.
-
It sounds to me that obviously
-
there's a lot of support across the board
-
and it looks as though the commissioner's
-
asking for one more working session,
-
which seems to make sense
-
and his comments were very positive.
-
So I think this is a good thing.
-
So thank you to the advocates.
-
Thank you to my colleagues,
-
particularly the lead sponsors
-
and the commissioner for his
participation this afternoon.
-
And thank you for all the folks
-
whom we will hear from shortly.
-
That's all I've got.
-
Thank you, councilor Essaibi George.
-
Thank you madam chair.
-
And I do apologize for earlier,
-
taking a call and unmuting myself somehow.
-
I apologize for that.
-
I hope I didn't say anything inappropriate
-
or share any certain secrets.
-
Thank you to the advocates
-
and to the commissioner
for being here today.
-
And thank you to the advocates
-
who have met with me over
the last few months,
-
over a period of a few months,
-
I just really learned a lot
through our times together,
-
especially from the younger folks.
-
I am interested in some,
-
if there is any information,
-
but I see the commissioner
is no longer with us.
-
So perhaps I'll just forward
this question to him, myself.
-
But my question is around
the FBI's evidence standards
-
around facial recognition technology.
-
I wonder if they've released that.
-
And then through the chair to the makers,
-
we had some questions
around section B, part two,
-
and perhaps we can discuss
this in the working session,
-
or I'll send this to you ahead of time,
-
but it does talk about it,
section B part two reads,
-
"Nothing in B or one shall prohibit Boston
-
or any Boston official
from using evidence related
-
to the investigation of a specific crime
-
that may have been generated
-
from face surveillance systems."
-
So we're just wondering
what types of evidence
-
are we allowing from here
-
and what situations are
we trying to accommodate.
-
Just the specific question
that we came up with
-
as an office, looking through that.
-
So I'll share that with the
makers of this ordinance
-
to understand that a little bit better.
-
But if any of the panels have information
-
on the FBI standards regarding,
-
if they've shared their standards.
-
That is my question for this time.
-
Thank you madam chair.
-
I can answer that quickly
-
and then Joy may also have an answer.
-
But one of the issues here
-
is that police departments
across the country
-
have been using facial
recognition to identify people
-
in criminal investigations
for decades now actually,
-
and then not disclosing that information
-
to criminal defendants, which
is a due process violation.
-
And it's a serious problem
-
in terms of the integrity of
our criminal justice process
-
in the courts and defendant's
rights to a fair trial.
-
So the FBI standards
that you're referencing
-
one of the reasons, at least
as far as we understand it
-
and I don't work for
the police department,
-
but one of the reasons as
far as we understand it,
-
that police departments have
not been disclosing information
-
about these searches
to criminal defendants,
-
is that there is no nationally
agreed upon forensic standard
-
for the use of facial
recognition technology
-
in criminal investigations.
-
And so my understanding is,
-
police departments and prosecutors fear
-
that if they were to disclose to courts
-
and criminal defendants,
-
that this technology was
used to identify people
-
in specific investigations,
-
that those cases would basically
get thrown out of court
-
because the people who
performed the analysis
-
would not really be able to testify
-
to their training under any
agreed upon national standard
-
for evaluating facial recognition results
-
or using the technology more generally.
-
I can answer your other
question about that exemption.
-
It is meant to address
basically wanted posters,
-
because for example, if the
FBI runs facial recognition
-
on an image of a bank robbery suspect
-
and then produces a wanted poster
-
that has that person's name on it
-
and shares with the
Boston Police Department,
-
obviously the BPD is not going
to be in a position to ask
-
every single police
department in the country,
-
where did you get the name
attached to this picture?
-
And so that's what that
exemption is meant to address.
-
There have been some concerns
-
from some of our allied organizations
-
that we need to tighten that up
-
with a little more extra language
-
to make clear that that
does not allow the BPD
-
to use the RMV system
for facial recognition.
-
And we will be suggesting some language
-
to the committee to that effect.
-
And just to echo a little bit
-
of what Kade is sharing here,
-
I do wanna point out that in April, 2019,
-
with the example of the
student being misidentified
-
as a terrorist suspect,
you had that photo shared.
-
And even though it was incorrect,
-
you have the ramifications of what happens
-
because of the presumption of guilt.
-
And so this is why I especially believe
-
we should also be talking about systems
-
that try to infer criminality
-
using AI systems in any kind of way.
-
Because again, the presumption of guilt
-
then adds to the confirmation bias
-
of getting something from a machine.
-
So, I think in the working session,
-
will there be improved language
-
around tightening that piece up?
-
Yes, we have some suggestions.
-
Yeah, thank you, thank you madam chair.
-
You're welcome, councilor Flynn.
-
Wasn't sure if you had any questions.
-
Thank you, thank you councilor Edwards.
-
I know I had a conversation awhile back
-
with Erik Berg from the teacher's
union about the subject,
-
and I learned a lot
-
and I learned a lot from
listening to the advocates
-
this afternoon.
-
My question is, I know Joy
covered it a little bit,
-
but besides criminal investigations,
-
what are the reasons that the government
-
would want surveillance cameras?
-
Surveillance cameras, councilor Flynn,
-
or facial recognition
technology specifically?
-
Both.
-
Well, my understanding
is government agencies,
-
including the Boston
Transportation Department
-
use surveillance cameras for
things like traffic analysis,
-
accident reconstruction.
-
So there are non-criminal
uses for surveillance cameras.
-
I'm not aware of any
non-criminal uses in government,
-
at least for facial surveillance,
-
except in the,
-
I would say so-called intelligence realm.
-
So that would mean not a
criminal investigation,
-
but rather an intelligence investigation,
-
for example of an activist
-
or an activist group or a
protest or something like that.
-
Okay.
-
And besides government,
-
what are the purposes
of facial recognition
-
that businesses would want it for?
-
Then they might use it in a store,
-
but what are some of the other reasons
-
the private sector might use them?
-
Sure, so speaking to private sector uses
-
of facial recognition,
-
oftentimes you see it
being used for security
-
or securing access
to a particular place.
-
So only particular
employees can come in.
-
Something that's more alarming
-
that we're seeing with commercial use,
-
is the use in housing.
-
And so we have a case even in Brooklyn,
-
where you had tenants
saying to the landlord,
-
we don't want to enter
our homes with our face,
-
don't install facial
recognition technology.
-
They actually won that case,
-
but not every group is
going to be so successful.
-
So you can be in a situation
-
where your face becomes the key.
-
But when your face is the key
and it gets stolen or hacked,
-
you can't just replace it so easily.
-
You might need some plastic surgery.
-
So we see facial recognition technologies
-
being used for access in certain ways.
-
And again, the dangerous use of trying
-
to predict something about somebody.
-
Are you going to be a good employee?
-
You have companies like Amazon,
-
saying we can detect fear from your face.
-
And so thinking about how
that might be used as well.
-
The other thing we have to consider
-
when we're talking about commercial uses
-
of facial recognition technologies,
-
is oftentimes you'll have companies
-
have general purpose facial recognition.
-
So this then means other people can buy it
-
and use it in all kinds of ways.
-
So from your Snapchat filter
-
to putting it on lethal
autonomous weapons.
-
You have a major range
-
with what can happen with
these sorts of technologies.
-
Pardon me councilor.
-
I would just jump in to also reiterate
-
that this ordinance only
applies to government conduct.
-
It would not restrict in any way,
-
any entity in the private sector
from using this technology.
-
But that said, some other uses,
-
I don't know if you've ever
seen the film "Minority Report,"
-
but in that film, Tom
Cruise enters the mall
-
and some technology says to him,
-
"Hello, sir, how did you
like the size small underwear
-
you bought last time?"
-
So that's actually now
becoming a reality as well,
-
that in the commercial space, at stores,
-
for marketing purposes,
-
we are likely going to see
-
if laws do not stop this from happening,
-
the application of facial surveillance
-
in a marketing and commercial context
-
to basically try to get
us to buy more stuff.
-
And to underscore that point,
-
you have a patent from Facebook,
-
which basically Facebook has
so many of our face prints.
-
We've been training
their systems for years.
-
The patent says, given that
we have this information,
-
we can provide you background details
-
about somebody entering your store,
-
and even give them a
trustworthiness score
-
to restrict access to certain products.
-
This is a patent that has
been filed by Facebook.
-
So it's certainly within the realm
-
of what companies are exploring to do.
-
And you already have companies
-
that use facial recognition technologies
-
to assess demographics.
-
So you have cameras that
can be put into shelves.
-
You have cameras that can
be put into mannequins.
-
So you already have this going on
-
and in Tacoma, Washington,
-
they even implemented a system
-
where you had to be face checked
-
before you could walk
into a convenience store.
-
So this technology is already out there.
-
Well thank you.
-
I know my time is up.
-
I'm looking forward to
continuing the conversation
-
'cause I have a couple more questions,
-
but I'll ask them at another time.
-
But again, thank you to the advocates
-
and thank you to councilor Edwards.
-
Thank you.
-
So it's between me and public testimony.
-
So I'm gonna just put my
three questions out there
-
and then we're gonna go
right down the list of folks
-
who have signed up and RSVP'd.
-
Again to those folks who are gonna come
-
after you've testified,
-
we were maxed out in our limit of people
-
who could participate.
-
So if you're willing to
testify and then also sign out
-
and watch through YouTube
-
or watch through other mechanisms
-
to allow somebody else to testify,
-
that would be very
helpful to this process.
-
I'm looking specifically
at the issue of enforcement
-
in this statute or in this ordinance.
-
And it says no evidence derived therefrom
-
may be received in evidence in a proceeding
-
in or before any department, officer,
agency, or regulatory body.
-
I want to be clear if the
police received this evidence,
-
could a prosecutor use it in court?
-
So -
That's my question.
-
So that's my issue.
-
There seems to be no
actual evidence prohibition
-
for use in court against somebody.
-
The other issue or concern I have
-
is a provision: violations
of this ordinance
-
by a city employee, shall
result in consequences
-
that may include retraining,
suspension, termination,
-
but they're all subject to provisions
-
of collective bargaining agreements.
-
So all of that could go away
-
if their union hasn't agreed
-
to that being part of
the disciplinary steps
-
for that particular employee.
-
So to me that signals the patrolmen
-
and other unions, that the folks
-
who are part of them need
to have this as part of their,
-
a violation of contract
or violation of standard,
-
I think maybe I'm wrong.
-
And then finally, Joy, your testimony
-
about the private sector
has really hit me.
-
Thank you so much.
-
I'm particularly concerned about this.
-
I'm a twin.
-
Facebook tags my sister in all
of my pictures automatically.
-
I mean I'm a twin, councilor
Essaibi George has triplets.
-
So for those, we're called
multiples, you're all singletons,
-
you were born by yourself.
-
We share a birth, multiples do, you know,
-
and so naturally I'm concerned
about this technology.
-
Not that my sister is inclined
to do any criminal activity,
-
but she may go to a protest
or two, I don't know,
-
but either way, the point is,
-
I'm concerned about the private
actor and how they interact.
-
And what I'm particularly concerned about
-
is how far this will go down the line,
-
the chain, the supply chain, right?
-
'Cause we have $600 million in contracts
-
for all sorts of things
for the city of Boston.
-
If we have a contract
with a cleaning company
-
that requires the workers to sign in
-
with facial recognition, right?
-
So how far down the line
can we go with our money?
-
Now I understand time is of the essence.
-
And that might be a big,
-
big question that we can work
out in the working session.
-
I'm particularly concerned
about the courts.
-
So if you wanna just focus
on that one, thank you.
-
Thank you, councilor.
-
So just very quickly,
-
we are unclear on what the
city council's power is
-
to control the prosecutor's office.
-
I think we'll look more into that,
-
we'll look closely at that.
-
On the second question, I agree.
-
We need to look at the BPA agreement,
-
which I understand expires this summer.
-
And then finally on the
private sector concerns,
-
I share them.
-
Again, we wanna get this government ban
-
passed as quickly as possible.
-
The ACLU supports approaches like that,
-
that the state of Illinois has taken.
-
They passed the nation's strongest
-
consumer-facing biometrics privacy law.
-
It's called the Biometric
Information Privacy Act,
-
BIPA.
-
BIPA essentially prevents
private companies,
-
any private companies,
-
from collecting your biometric data
-
without your opt-in consent.
-
And that's not, I clicked a button
-
when I was scrolling
through a terms of service,
-
it's that you actually have
to sign a piece of paper
-
and give it to them.
-
So we need a law like that
right here in Massachusetts.
-
Thank you.
-
So at this point,
-
I'm gonna turn it over
to public testimony.
-
I have some folks who
have already committed
-
and asked to speak today.
-
I'm gonna go through them.
-
I'm going to, it's
already quarter to five.
-
I'm going to try and end this
hearing no later than six.
-
So that's an hour and 15 minutes
-
to move through public testimony.
-
And I'm gonna try,
-
I'm gonna have to keep
folks to two minutes
-
as I said before,
-
because there are a lot
of people signed up.
-
So, of the folks who've signed up,
-
I will go through that list.
-
Then I'm gonna invite
people to raise their hands
-
who may not have signed up directly,
-
who would also like to testify.
-
All right, so I have on the
list, Bonnie Tenneriello
-
from the National Lawyers Guild.
-
After her I have Callan Bignoli
-
from the Library Freedom
Project and Maty Cropley.
-
Those are the three folks lined up.
-
Let's see.
-
Kaitlin, is there a Bonnie?
-
Okay well, I see Callan
right now ready to go.
-
So I'm gonna go ahead
and start with Callan
-
if you wanna start and
your two minutes has begun.
-
Sure, hi, I'm Callan Bignoli,
-
I am a resident of West Roxbury
-
and a librarian in the Boston area.
-
And I am speaking on behalf
-
of the Library Freedom Project today.
-
So thank you to the chair
-
and thank you to all of city council
-
for the chance to testify today
-
on this important piece of legislation.
-
So, the Library Freedom Project
is a library advocacy group
-
that trains library workers to advocate
-
and educate their library community
-
about privacy and surveillance.
-
We include dozens of library
workers from the U.S.,
-
Canada and Mexico,
-
including eight librarians
from Massachusetts.
-
These library workers
educate their colleagues
-
and library users about surveillance
-
by creating library programs,
union initiatives, workshops,
-
and other resources to inform
and agitate for change.
-
And facial recognition technology
-
represents a class of
technology that is antithetical
-
to the values of privacy
-
and confidentiality of library workers.
-
As library staff, we understand
that part of our work
-
is to represent these
values in the services
-
and resources we provide
to our library community
-
in order to reduce the harm of
state and corporate scrutiny.
-
As we understand this,
-
we can see the social
and economic imperatives
-
that privilege some and marginalize others
-
that are encoded into
many computer systems
-
and applications.
-
This is a result of the structure
-
of our technology industry,
-
which prioritizes the
interests of management,
-
venture capitalists and stockholders,
-
who are mostly white men.
-
The racial and gender biases
-
inherent in face surveillance technology
-
are indicative of those values
-
and they describe how moneyed
interests seek to shape
-
or reinforce racist and gendered oppressions
-
by creating computer systems
-
that extend the reach of profiling
-
through the exercise of capital.
-
Democratic direct worker
and community control
-
over the development, acquisition,
-
and practices of surveillance
and surveillance technology,
-
is an urgent priority to protect safety
-
and privacy in our
neighborhoods and schools.
-
So the Library Freedom Project
-
supports this urgently necessary ordinance
-
to ban facial recognition
technology in Boston
-
for the safety and privacy
-
of those who were working in
the city and who live here.
-
we urge the council to wrest
control over surveillance -
-
Oh, two minutes is up.
Oops!
-
Two minutes is up, so
If you wanna summarize.
-
I have one, one more sentence.
-
We cannot allow Boston
to adopt authoritarian,
-
unregulated and biased surveillance
technology, thank you.
-
Thank you very much. Bonnie,
-
I don't know if Bonnie's available,
-
if not I see Maty is available.
-
So I'm gonna go ahead and
start your two minutes, Maty.
-
Thank you very much.
-
Thank you councilor Edwards
-
and thank you to the city council
-
and all the panelists for
taking up this important issue.
-
My name is Maty Cropley,
I use they/them pronouns.
-
I'm a teen librarian and
a bargaining unit member
-
of the Boston Public Library
Professional Staff Association,
-
MSLA local 4928 AFT.
-
Our union of library workers
-
supports a ban on facial
recognition in Boston.
-
In voting to do so,
-
our union recognizes
that facial recognition
-
and other forms of biometric surveillance
-
are a threat to the civil liberties
-
and safety of our colleagues
and of library users.
-
Our library ethics of privacy
and intellectual freedom
-
are incompatible with
this invasive technology.
-
We recognize that facial recognition
-
and other biometric
surveillance technologies
-
are proven to be riddled with racist,
-
ableist and gendered algorithmic biases.
-
These systems routinely
misidentify people of color,
-
which can result in needless
contact with law enforcement
-
and other scrutiny, essentially
automating racial profiling.
-
We recognize the harm of
surveillance for youth in our city,
-
especially black, brown
and immigrant youth.
-
Unregulated scrutiny by authorities
-
leads to early contact
with law enforcement
-
resulting in disenfranchisement,
marginalized futures
-
and potential death by state violence.
-
We recognize that public
areas such as libraries,
-
parks, and sidewalks exist as spaces
-
in which people should be free to move,
-
speak, think, inquire, perform, protest,
-
and assemble freely without
the intense scrutiny
-
of unregulated surveillance
by law enforcement.
-
Public spaces exist to extend our rights
-
and provide space for the performance
-
of our civil liberties, not policing.
-
A ban on face surveillance technology
-
is critically important for the residents
-
and visitors to the city of Boston.
-
Our union encourages you to ban
-
the use of face surveillance
in the city of Boston
-
by supporting and passing
this very crucial ordinance.
-
I'd like to thank you for your time.
-
And we are the Boston Public Library,
-
Professional Staff Association.
-
Thank you very much.
-
Thank you very much.
-
I understand that Ms. Bonnie is available.
-
She's under the name Linda Rydzewski,
-
and we're gonna pull her up.
-
I'll check when she's available.
-
After her will be the
following three testifiers,
-
Nikhill Thorat, Nour Sulalman
and professor Woodrow Hartzog.
-
First, Bonnie.
-
She's under Linda Rydzewski, sorry.
-
Linda Rydzewski
-
I am so sorry, I've had
some technical problems.
-
This is Bonnie Tenneriello,
-
on behalf of the National Lawyers Guild.
-
And I had to sign in
with my office account
-
and I am not representing
Prisoner's Legal Services
-
and I couldn't change the screen
name once I was signed in.
-
Am I audible to everyone?
-
You're on and you have two minutes.
-
Okay.
-
I'm speaking on behalf of
the National Lawyers Guild,
-
Massachusetts Chapter,
-
which for over 80 years has
fought for human rights,
-
and we're deeply
concerned over the dangers
-
of facial surveillance.
-
We strongly support this ordinance.
-
Facial recognition
deployed through cameras
-
throughout a city can be used
-
to track the movements of dissidents
-
and anyone attending a protest
as others have observed.
-
Our history and the National Lawyers Guild
-
shows that this danger is real.
-
We have defended political dissidents
-
targeted by law enforcement,
-
such as members of the
Black Panther Party,
-
American Indian Movement,
-
Puerto Rican Independence Movement
-
and more recently here in Boston,
-
Occupy Boston, climate change activists,
-
those opposing a straight pride event
-
and Black Lives Matter.
-
We know that the dangers
of political surveillance
-
and political use of
this technology are real.
-
The National Lawyers Guild
-
also helped expose illegal surveillance
-
in the Church Committee
1975-76 COINTELPRO hearings.
-
We know from this experience
-
that this can be used to disrupt
-
and prosecute political protest.
-
People must be able to demonstrate
-
without fear of being identified
and tracked afterwards.
-
I will not repeat what others have said
-
very articulately about just
the basic privacy concerns
-
of being able to walk through a city,
-
without having cameras track your identity
-
and movements throughout the city,
-
nor will I repeat the excellent testimony
-
that was given about the
racial bias in this technology.
-
But on behalf of the
National Lawyers Guild,
-
I will say that we cannot
tolerate a technology
-
that perpetuates racism
in law enforcement,
-
as tens of thousands of people
-
take to the streets
calling for racial justice.
-
We're at a historical moment
when our society is demanding
-
and this council is
considering a shift away
-
from control and policing
of communities of color
-
and low-income communities
-
and towards community directed programs
-
that channel resources to
jobs, healthcare, education,
-
and other programs that
create true safety.
-
We reject facial surveillance
-
and we urge the council
to pass this ordinance.
-
Thank you very much.
-
So up next we have Nikhil Thorat.
-
Sorry, if I mispronounced your name.
-
No harm, you got that right.
-
So thank you chair and to the
amazing speakers before me.
-
My name is Nikhil, I work
at Google Brain in Cambridge.
-
So I specifically work in the area
-
of machine learning fairness
and interpretability.
-
So this is sort of the stuff that we do.
-
So I'm here as a citizen,
-
and I am very much in
favor of the ordinance
-
banning facial recognition
by public officials
-
here in Boston.
-
So modern AI algorithms
are statistical machines.
-
They base predictions off
patterns in data sets
-
that they're trained on,
there's no magic here.
-
So any bias that stems from
how a data set is collected,
-
like non-representative collection of data
-
for different races
-
because of historical context,
-
gets automatically reflected
in the AI model's predictions.
-
So what this means is that it
manifests as unequal errors
-
between subgroups, like race or gender.
-
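[A minimal sketch, with invented records, of the disaggregated evaluation being described: computing error rates per subgroup instead of one overall accuracy, which is how audits like the Gender Shades paper and the NIST study surface unequal errors.]

```python
# Toy per-subgroup error-rate computation; the records are made up.
from collections import defaultdict

records = [
    # (subgroup, model_was_correct)
    ("darker_female", False), ("darker_female", False), ("darker_female", True),
    ("lighter_male", True), ("lighter_male", True), ("lighter_male", True),
]

totals, errors = defaultdict(int), defaultdict(int)
for subgroup, correct in records:
    totals[subgroup] += 1
    errors[subgroup] += (not correct)

# A single overall accuracy would hide exactly the gap shown here.
for subgroup in totals:
    print(f"{subgroup}: error rate {errors[subgroup] / totals[subgroup]:.0%}")
```
-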
And this is not theoretical,
-
it's been studied very deeply
-
by many AI researchers in this field
-
who have been on this call.
-
There's a federal NIST,
-
a National Institute of
Standards and Technology study
-
that looks at almost 200
facial recognition algorithms
-
and found significant
demographic disparities
-
in most of them.
-
And there's another paper
by Joy here, Gender Shades,
-
that looked at three commercial
classification systems
-
and found that there are
-
incorrect results 35% of the time
for dark-skinned females
-
versus only 0.8% incorrect
for light-skinned males.
-
This is, this is horrible, right?
-
So these mistakes also can't
be viewed in isolation.
-
They will perpetuate bias issues
-
by informing the collection
-
of the next generation of datasets,
-
which are then again used
-
to train the next generation of AI models.
-
Moreover, when mistakes are made either
-
by a machine or by a human being,
-
we need to be able to ask the question,
-
why was that mistake made?
-
And we simply haven't
developed robust tools
-
to be able to hold these algorithms
-
to an acceptable level of transparency,
-
to answer any of these critical questions.
-
Deployment of these systems
-
will exacerbate the issues of racial bias
-
in policing and accelerate
the various social inequalities
-
that we're protesting;
innocent people will be targeted.
-
The threshold for deploying these systems
-
should be extremely high,
-
one that's arguably not
reachable, beyond accuracy,
-
beyond the hundred percent accuracy ideal.
-
Data would have to be public
-
and there would have to be
actionable algorithmic oversight.
-
This is an incredibly high bar
and we are nowhere near that,
-
both algorithmically and societally.
-
Along with this ordinance,
-
we must start laying the groundwork
-
for rigorous criteria
for future technology
-
as it is an inevitable conversation.
-
But first we must pass the ordinance
-
banning facial recognition in Boston.
-
Thank you very much.
-
Thank you very much,
-
following Nikhil we have Nour Sulalman.
-
Sorry again for mispronunciation.
-
It's all good.
-
Thank you council members
-
and authors of the ordinance
for your service to the city.
-
My name is Nour Sulalman Langedorfer,
-
and I use she/her pronouns.
-
I'm a resident of Jamaica Plain,
-
and I'm here in my capacity
as a private citizen.
-
I won't repeat what everyone else said,
-
but I will say that the
use of facial recognition
-
by law enforcement today,
is especially disturbing
-
due to the epidemic of police
violence in this country
-
and the dysfunctional relationship it has
-
with black and brown people in the city.
-
And with all due respect,
-
I'm unconvinced by the distinction made
-
between facial surveillance
and facial recognition.
-
I'm even more disturbed or I am disturbed
-
that the commissioner does not know
-
if the police department uses
facial recognition software.
-
I ask the council to consider supporting
-
that the ordinance establish a total ban
-
on the use of facial recognition software
-
in all public spaces,
including by law enforcement,
-
malls and shopping centers, retail spaces,
-
restaurants open to the public,
state government buildings,
-
courts, public schools,
schools funded by the state,
-
on public streets and
in places of employment,
-
as a condition of employment
-
and as a condition of landlords
leasing rental properties.
-
The ban should also apply to the use
-
and purchase of facial recognition software
-
by any state agency operating in Boston,
-
including law enforcement
-
and I further urge the council
to consider adopting a ban
-
on the use of private DNA
databases by law enforcement
-
and all remote surveillance systems,
-
including those that
recognize individual gaits.
-
Information gathered through
the aforementioned technologies
-
should not be deemed
admissible in civil, criminal
-
or administrative proceedings
-
or serve as a basis for
granting a search warrant.
-
Thank you madam chair.
-
Thank you very much.
-
I have a professor Woodrow Hartzog.
-
Okay, two minutes.
-
Thank you so much, dear councilors,
-
thank you for allowing me the opportunity
-
to speak in support of the ordinance
-
banning face surveillance in Boston.
-
I am a professor of law
and computer science
-
at Northeastern University,
-
who has been researching and writing
-
about the risks of facial
recognition technologies
-
for over seven years,
-
I make these comments in my
personal academic capacity.
-
I'm not serving as an advocate
-
for any particular organization.
-
Please allow me to be direct,
-
facial recognition technology
-
is the most dangerous
surveillance tool ever invented.
-
It poses substantial
threats to civil liberties,
-
privacy and democratic accountability.
-
Quite simply the world has
never seen anything like it.
-
Traditional legal rules, such
as requiring legal process
-
or people's consent before
surveillance could be conducted,
-
will only entrench these systems,
-
and lead to a more watched society
-
where we are all treated
as suspects all the time.
-
Anything less than a ban
-
will inevitably lead to unacceptable abuse
-
of the massive power
bestowed by these systems.
-
That is why I believe that this ordinance
-
is justified and necessary.
-
There are many ways that law enforcement's
-
use of facial recognition
technology can harm people.
-
Government surveillance
-
has disproportionately targeted
marginalized communities,
-
specifically people of color,
-
facial recognition will
only make this worse
-
because it is inaccurate and biased
-
along racial and gender lines.
-
And while facial
recognition is unacceptable
-
when it's biased and error prone,
-
the more accurate it becomes,
-
the more oppressive it will get.
-
Accurate systems will be more
heavily used and invested in
-
further endangering the people of Boston.
-
Use of facial recognition
seems likely to create
-
a pervasive atmosphere of chill.
-
These tools make it easier
to engage in surveillance,
-
which means that more
surveillance can and will occur.
-
The mere prospect of a
hyper-surveilled society
-
could routinely prevent citizens
-
from engaging in First
Amendment protected activities,
-
such as protesting and worshiping,
-
for fear of ending up on
government watch lists.
-
Facial recognition also poses a threat
-
to our ideals of due process,
-
because it makes it all too easy
-
for governments to excessively
enforce minor infractions
-
as pretext for secretly monitoring
-
and retaliating against our citizens
-
who are often targeted for
speaking up like journalists,
-
whistleblowers and activists.
-
The net result could be
anxious and oppressed citizens
-
who are denied fundamental
opportunities and rights.
-
For the reasons outlined above,
-
I strongly support the ordinance
-
banning facial recognition
surveillance in Boston.
-
It's the best approach for preventing
-
an Orwellian future
-
and ensuring that the city of Boston
-
remain a place where people can flourish
-
and civil liberties are protected.
-
Thank you so much, professor.
-
We have coming up, Alex
Marthews, Emily Reif,
-
Jurell Laronal.
-
Those three folks, I don't
know if they're ready to go.
-
But we'll start, I see Jurell, I see Alex.
-
Go ahead, Alex Marthews.
-
Hi.
-
I don't wish to echo what
other excellent advocates
-
have been saying,
-
but I am here representing Digital Fourth,
-
Restore the Fourth Boston,
-
which is a volunteer-based
-
advocacy group on surveillance
and policing issues
-
in the Boston area.
-
I wanna highlight some
comments of commissioner Gross,
-
in addition to our testimony.
-
Commissioner Gross seems to
recognize the racial biases
-
and limitations of this
technology as it stands now,
-
but he says it's improving
and he holds out hope
-
that it will be more
accurate in the future.
-
And that perhaps at some future point,
-
the Boston City Council should reconsider
-
whether it will be appropriate
-
to implement facial
recognition technology.
-
The problem with this is that
-
a 100% accurate facial
recognition technology system
-
would be terrifying.
-
And it would represent the
death of anonymity in public.
-
In terms of the effect of
facial recognition technology
-
on legislators themselves,
-
there have been a number of studies now
-
that show that current facial
recognition technologies
-
are perfectly capable
-
of misidentifying legislators as criminals
-
and these systems being 100% accurate
-
will pose even more
worrying risks about that.
-
As city councilors,
-
where are you going when
you're going in public,
-
are you meeting with
social justice groups?
-
Are you meeting with immigrant advocates?
-
Are you meeting with, God forbid,
-
police reform and
accountability groups?
-
Are you doing something
embarrassing perhaps
-
that could be used to influence your votes
-
on matters of public interest
-
or matters relating to the police budget?
-
The risks of these things
happening are very real,
-
if this technology is implemented in the future.
-
And the only appropriate response
-
is for today's city council
to ban this technology.
-
Thank you very much.
-
I believe, I don't know
if Emily is available,
-
but I do see Jurell, Jurell
would you like to go?
-
Thank you.
-
Thank you, Boston City Council
for having this platform.
-
I wanna thank you.
-
Thank you and a shout out
to the expert advocates
-
on the subject for shedding
light on the subject
-
that me and many of us
are not fully versed in.
-
What I am well versed in, though,
-
is the reality surrounding
-
the Boston Police Department's history
-
and the negative impact that they have had
-
on black and brown
Bostonians for decades.
-
My name is Jurell Laronal,
-
I'm a formerly incarcerated black man
-
from a heavily policed
neighborhood of Dorchester.
-
I'm a community organizer for
Families for Justice as Healing
-
and I'm here to support the ordinance
-
banning facial recognition
technology in Boston
-
presented by councilors Wu and Arroyo.
-
First, what are we talking about?
-
We are talking about a force whose actions
-
and racial profiling techniques
have been documented,
-
tried in court and notoriously
known for targeting
-
and terrorizing those from
my disinvested communities.
-
We are talking about
militarization to intimidate,
-
force their will on
black and brown citizens
-
of the Commonwealth.
-
A force whose own field interrogation
-
and observation records
-
show almost 70% of those stopped
were black residents
-
in a city that's only around 25.2% black.
-
So we're talking about
taking the same force
-
and further funding them
-
and equipping them with another weapon
-
that would definitely be used
to further oppress, target
-
and incarcerate our sons and fathers
-
and daughters and mothers.
-
My organization is currently
supporting the appeals
-
and release efforts of black
men and their families,
-
who have been incarcerated
since the seventies
-
due to racist profiling
and racist misidentification.
-
When I think about facial
recognition software,
-
which at best is predatory and inaccurate,
-
I don't see a solution
and I don't see safety.
-
I see yet another form of racial bias
-
and another way of police
-
to cause harm in my community.
-
Today, with our city's historical
racial profiling issue
-
and with, as somebody said earlier,
-
the federal government study
-
finding that the algorithms
-
failed to identify people of color
-
and children and the elderly and women,
-
I see that it needs to be banned
-
because we can't continue condemning
-
generation after generation
of black and brown
-
men and women from Dorchester,
Roxbury and Mattapan,
-
to more trauma and more incarceration.
-
Investment in our communities
-
through community led processes,
-
is the only thing that we need,
-
not facial recognition tech.
-
Oh these bees are messing
with me, sorry about that.
-
No matter what level that
technology reaches and thank you.
-
That's all I have to say.
-
I'm gonna keep it quick.
-
Thank you very much.
-
I really do appreciate it.
-
And I think, again, when folks speak from the heart
-
about their own experience
-
and specifically talk about
how it would impact them,
-
I do appreciate that.
-
Not that I don't appreciate the experts
-
but I'm telling you there's nothing better
-
than hearing about how
residents are directly impacted.
-
Thank you.
-
I have also on the list, Cynthia Pades,
-
I don't have a name,
-
but I have an organization
called For the People Boston.
-
I don't think we have either one.
-
There is.
-
Oh, I see a Marco, Marco Staminivich.
-
Marco, ready to go?
-
Marco!
-
Sure.
Okay.
-
How's it going, everyone?
-
Thank you to the council and the panelists
-
for allowing me to speak.
-
So I'm a machine learning engineer
-
working in an adjacent field
-
and a resident of Jamaica Plain
-
and I'm here today as a private citizen.
-
So this is my first time
speaking at a public hearing,
-
so cut me some slack.
-
Although I'm not an expert
-
on the surveillance
aspect and the applications,
-
I am familiar with the
technology aspect of this stuff.
-
So I felt the need to testify today
-
because of the gravity of this issue.
-
Although the technology may
seem benign upon first glance,
-
this is an extremely
powerful and dangerous tool.
-
So to me this amounts to basically
-
a form of digital stop and frisk,
-
and it's even more dangerous
-
since it can be kind
of constantly running,
-
constantly working anywhere at all times,
-
kind of tracking folks.
-
This seems like a clear violation
-
of the Fourth Amendment to me.
-
As many commenters and
panelists have noted today,
-
the technology is certainly not perfect
-
and it's riddled with
biases based on the data
-
that it was trained on
and how it was trained
-
and by the folks that trained it.
-
And although this is an
important thing to note,
-
this would really be a moot point
-
when evaluating these
technologies for deployment.
-
Because while a poorly
functioning system is bad
-
and introduces issues,
-
a perfectly functioning system
would be much, much worse
-
and would basically signal a world
-
where we could be tracked out of hand
-
at any time that police
or government wanted.
-
So that's really what I wanted to say.
-
And I wanted to kind of state my support
-
for a ban on any type of
facial recognition technology
-
for use by the police or the government.
-
And that we should send a clear signal
-
to disincentivize the
research and development
-
of these types of systems in the future.
-
Thank you.
-
Thank you, that was perfectly on time,
-
for your first time participating.
-
We really appreciate it.
-
I hope that this is not your last time.
-
We are dealing with a
lot of different issues.
-
Your voice is definitely
welcome, thank you so much.
-
Up next I have,
-
I'm gonna name Will Luckman
followed by Lizet Medina,
-
and then also a Nathan
Sheard, I apologize again,
-
if I mispronounce your name,
-
feel free to just state your name again.
-
For folks who have already testified,
-
again, since we're over capacity,
-
if you sign out and then watch on YouTube
-
that allows for someone else
to get in line to also speak.
-
So just letting you know that
-
that's what we're trying to do.
-
So I will now turn it over.
-
I believe Will's ready to go.
-
And you have two minutes Will.
-
Thank you, good afternoon.
-
My name is Will Luckman
and I serve as an organizer
-
with the Surveillance Technology
Oversight Project or STOP
-
and STOP advocates and litigates
-
to fight discriminatory surveillance.
-
Thank you councilors for
holding this hearing today
-
and for proposing this crucial reform.
-
Today, my oral remarks are an
excerpt of written testimony
-
being entered into the record.
-
Facial recognition is biased,
broken, and, when it works,
-
antithetical to a democratic society.
-
And crucially, without this ban,
-
more people of color will be
wrongly stopped by the police
-
at a moment when the
dangers of police encounters
-
have never been clearer.
-
The technology that
drives facial recognition
-
is far more subjective than many realize.
-
Artificial intelligence is the aggregation
-
of countless human decisions
codified into algorithms.
-
And as a result, human
bias can affect AI systems,
-
including those that
supposedly recognize faces
-
in countless ways.
-
For example, if a security camera learns
-
who is "suspicious looking"
using pictures of inmates,
-
the photos will teach the AI
-
to replicate the mass incarceration
of African-American men.
-
In this way, AI can
learn to be just like us,
-
exacerbating structural discrimination
-
against marginalized communities.
-
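The mechanism described here can be sketched in a few lines of Python: a toy nearest-centroid classifier fitted to synthetic data in which one group was labeled "suspicious" far more often than the other. Every number below is hypothetical and invented for illustration; this is not any vendor's system, only a minimal demonstration that a model reproduces whatever skew is in its training labels.

# Minimal sketch (synthetic data): a skewed training set yields a skewed model.
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, feature_shift, positive_rate):
    # Five meaningless "face" features plus a label that reflects who was
    # put into the training archive, not who actually did anything.
    X = rng.normal(loc=feature_shift, scale=1.0, size=(n, 5))
    y = rng.random(n) < positive_rate
    return X, y

# Group A was labeled "suspicious" 5% of the time, group B 40% of the time.
Xa, ya = make_group(1000, feature_shift=0.0, positive_rate=0.05)
Xb, yb = make_group(1000, feature_shift=1.0, positive_rate=0.40)
X = np.vstack([Xa, Xb])
y = np.concatenate([ya, yb])

# Nearest-centroid classifier: predict whichever class mean is closer.
mu_pos = X[y].mean(axis=0)
mu_neg = X[~y].mean(axis=0)

def predict(Xnew):
    d_pos = np.linalg.norm(Xnew - mu_pos, axis=1)
    d_neg = np.linalg.norm(Xnew - mu_neg, axis=1)
    return d_pos < d_neg

# Score fresh, equally innocent samples drawn from each group.
rate_a = predict(rng.normal(0.0, 1.0, size=(1000, 5))).mean()
rate_b = predict(rng.normal(1.0, 1.0, size=(1000, 5))).mean()
print(f"flagged 'suspicious': group A {rate_a:.0%}, group B {rate_b:.0%}")

Run as written, the toy model flags several times more of group B than of group A, purely because of how the training archive was assembled.
-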
As we've heard,
-
even if facial recognition
worked without errors,
-
even if it had no bias,
-
the technology would
still remain antithetical
-
to everything the city
of Boston believes in.
-
Facial recognition
manufacturers are trying
-
to create a system that
allows everyone to be tracked
-
at every moment in perpetuity.
-
Go to a protest, the system knows,
-
go to a health facility,
it keeps a record.
-
Suddenly Bostonians lose
the freedom of movement
-
that is essential to an open society.
-
If the city fails to act soon,
-
it will only become
harder to enact reforms.
-
Companies are pressuring local,
state and federal agencies
-
to adopt facial recognition tools.
-
BriefCam, the software
powering Boston's surveillance
-
camera network, has released a
new version of their software
-
that would easily integrate
-
invasive facial recognition tools.
-
Although the commissioner
-
says he will reevaluate
the new version on offer,
-
he also expressed interest in waiting
-
until the accuracy of
the technology improves
-
or making exceptions for facial
recognition use right now.
-
To definitively and
permanently avoid the issues
-
with facial recognition
that I've raised here,
-
we need a comprehensive ban now.
-
And I'll conclude on a personal note.
-
I live and work in New York City,
-
but I was born in Boston
and raised in Brooklyn.
-
It pains me to see the
current wave of protests
-
roiling the area,
-
because it demonstrates the bias
-
and unequal law enforcement practices
-
I remember from my youth,
have yet to be addressed.
-
I know that the people of the Commonwealth
-
want to see a change,
-
and I believe that the
council is on their side.
-
In practice, inaccuracies aside,
-
facial recognition systems
lead to increased stops
-
for people of color, and increased stops mean
-
an increase in opportunity
for police violence and abuse.
-
We must recognize that Black Lives Matter
-
and to do so we must realize
-
that technology doesn't
operate in a neutral vacuum.
-
Instead it takes on the character
-
of those building and deploying it.
-
I encourage the council to respond
-
to their constituents
demands for police reform
-
by immediately banning the use
-
of this harmful technology in Boston.
-
Thank you.
Thank you very much Will.
-
Lizet Medina.
-
Hi, good afternoon.
-
My name is Lizet Medina, I'm
here as a private citizen.
-
I actually am pretty new to all of this.
-
This week I've found out
-
from the Student Immigrant Movement,
-
kind of what's going on.
-
And I just wanted to voice
my support for this ban.
-
I think, personally, as an immigrant,
-
and speaking toward the presence
-
of facial recognition
technology in the school system,
-
I think especially where we
have so many conversations
-
about the school-to-prison pipeline.
-
This tool could be used to facilitate
-
that kind of work in racially profiling
-
our students.
-
And I think now more than ever,
-
we have at this point in
time a decision to make
-
in protecting our residents of color,
-
black and brown residents and students
-
as they live in and try to
navigate the various systems
-
that are already putting pressure on them.
-
And so I think, especially
in light of recent events,
-
this is really a stand to support
-
residents of color in Boston,
-
over possible pros that
could be looked at in this.
-
And I also think it's troubling
-
that the commissioner doesn't seem
-
to be clear about what's
kind of going on with this.
-
In addition, it also isn't clear
-
that there is even any
oversight of whether or not
-
this could be used already,
that kind of thing.
-
So I think one, I'm just so thankful
-
that this conversation is being had,
-
that this is open to the public
-
and that I just wanted to voice my concern
-
for this being a tool for racial profiling
-
and my support for the ban.
-
Thank you.
-
Thank you very much.
-
Next is Nathan Sheard.
-
Again I apologize if I
mispronounced your name.
-
You have two minutes Nathan, go ahead.
-
No, you said it perfectly, thank you.
-
So my name is Nathan Sheard
-
and thank you for allowing me to speak
-
on behalf of the Electronic
Frontier Foundation
-
and our 30,000 members.
-
The Electronic Frontier Foundation,
-
strongly supports legislation
-
that bans government
agencies and employees
-
from using face surveillance technology.
-
We thank the sponsors of this ordinance
-
for their attention to
this critical issue.
-
Face surveillance is profoundly
dangerous for many reasons.
-
First, in tracking our faces,
-
a unique marker that we cannot change,
-
it invades our privacy.
-
Second, government use of the
technology in public places
-
will chill people from engaging in
-
First Amendment protected activity.
-
Research shows and courts
have long recognized
-
that government surveillance
-
of First Amendment
activity has a deterrent effect.
-
Third, surveillance
technologies have an unfair,
-
disparate impact against
people of color, immigrants
-
and other vulnerable populations.
-
Watch lists are frequently over-inclusive,
-
error riddled and used in conjunction
-
with powerful mathematical algorithms,
-
which often amplify bias.
-
Thus face surveillance is so dangerous
-
that governments must not use it at all.
-
We support the aims of this ordinance
-
and respectfully seek three amendments
-
to show that the bill will appropriately
-
protect the rights of Boston residents.
-
First, the bill has an
exemption for evidence
-
generated by face surveillance
-
that relates to the investigation of crime.
-
EFF respectfully asks that it be amended
-
to prohibit the use of face
surveillance enabled evidence
-
generated by or at the request
-
of the Boston Police Department
or any Boston official.
-
Second, the private right of action
-
does not provide fee shifting
for a prevailing plaintiff.
-
Without such fee shifting,
-
the only private enforcers
will be advocacy organizations
-
and wealthy individuals.
-
EFF respectfully
asks that language
-
be included to award
reasonable attorney's fees
-
to a prevailing plaintiff.
-
Third, the ban extends to private sector
-
use of face surveillance conducted
with a government permit.
-
The better approach, as
Kade offered earlier,
-
is for face surveillance to be used only
-
with informed opt-in consent.
-
Thus EFF respectfully requests
-
that the language be limited
-
to prohibiting private sector permitting
-
for the use of face
surveillance
-
at the government's request.
-
In closing, EFF once again
thanks the sponsors.
-
We look forward to seeing an ordinance pass
-
that bans government use of
face surveillance in Boston
-
and will enthusiastically
support the ordinance banning
-
face recognition technology in Boston,
-
if it is amended in these ways, thank you.
-
Nathan, I just wanna make sure I have notes
-
on your suggestions.
-
And I wanna make sure you had three,
-
which is the ban on use by the BPD,
-
no exemptions, period.
-
Two, that currently because [indistinct]
-
doesn't allow for fee
shifting or attorney's fees,
-
it really does put it
on either individuals
-
to take it on their own
without having any kind of,
-
I guess, partner or
support that can be funded.
-
And then three to extend
to private actors,
-
essentially down the supply chain
-
where the city of Boston
can tell private actors
-
that contract with us.
-
Because one of the reasons
I am always saying this,
-
one of the issues is how far we can go.
-
I mean, we can't regulate Facebook,
-
but we can regulate where money goes.
-
So I'm assuming I got that right.
-
Yeah, so just to reiterate,
-
so the first amendment
that we would suggest
-
is that right now,
-
there's an exclusion
forever for evidence
-
and Kade spoke earlier
-
to the fact that that's in reference to,
-
if they receive something
from the federal government
-
or want to post or what have you.
-
But to make sure that it's amended
-
to tighten that up a bit and say that,
-
as long as that
information isn't requested
-
or created at the request
of the Boston police force,
-
that's the first one.
-
The second one is the fee shifting.
-
So that it's not just the EFF and the ACLU
-
and wealthy individuals
that have the power
-
to make sure that it's enforced.
-
So folks can find
attorneys to support them
-
because the attorneys will
be compensated for that time.
-
And finally, on the third.
-
And I think that this speaks
to your greater question,
-
right now it prohibits
the offering of a permit
-
to a private company, agency,
person, what have you,
-
whereas we would tighten it up to say that
-
that a permit could be provided
-
as long as it wasn't being provided
-
for face surveillance to
be collected at the behest
-
or at the request of the
Boston Police Department.
-
So really making sure
-
that we're speaking back to the fact
-
that like private use
would be better protected
-
by something similar to
-
the Biometric Information
Privacy Act in Illinois
-
that requires informed,
opt-in consent
-
from the individual,
-
rather than simply saying
-
that no private actor can do face surveillance.
-
Thank you.
-
That was a great clarification.
-
I'm gonna go now to our next speakers.
-
Thank you very much.
-
I have a Sabrina Barroso,
Ismleda and Christina Vasquez.
-
So if we have.
-
Hi folks, thank you so much.
-
Just to let folks know,
Christina is not with us.
-
Some folks had to leave
because they have to work,
-
but so hi everyone, hi city councilors.
-
My name is Sabrina Barroso.
-
I am the lead organizer for
the Student Immigrant Movement.
-
The first undocumented
immigrant youth led organization
-
in the state of Massachusetts.
-
We believe that all
young people are powerful
-
and that when they are
supported and have space
-
and investment to grow,
-
they flourish into the
leaders of today and tomorrow.
-
At SIM, we are invested in protecting
-
and working on behalf of
the undocumented youth
-
and families of Massachusetts.
-
That's why I'm here with you all today
-
and I would like to make clear
-
that we must prohibit the
use of facial recognition
-
technology and software
in the city of Boston.
-
For far too long the
Boston Police Department
-
has been allowed to have full flexibility
-
and free will over how they police
and surveil our communities
-
and the consequences are devastating.
-
Our people are being criminalized
and funneled into prisons,
-
detention and deportation.
-
We need full community
control over the police.
-
And this is a key move to
bringing power to our people.
-
Right now, there are no laws
-
that control what the BPD
can purchase for surveillance
-
or how they use it and
when I learned this,
-
I was scared because I
thought about the history
-
between the BPD and families like mine.
-
And I thought about the people that I love
-
and the people I care about.
-
And I thought about the people
who are constantly harassed
-
and targeted by law enforcement
simply for existing.
-
So if you ask me,
-
facial recognition is not to be trusted
-
in the hands of law enforcement,
-
that tracks, monitors, hyper surveils,
-
black and brown communities,
-
and that puts immigrants, activists and
-
youth under constant
scrutiny and criminalization.
-
Councilors, would you really
trust a police department
-
that shares information with I.C.E.?
-
That shares student
information with I.C.E. as well,
-
that has led to the
deportation of students,
-
that exchanged emails with I.C.E.
saying "happy hunting."
-
And that has historically targeted
-
black and brown youth to
invade their personal space,
-
their bodies and their
privacy with stop and frisk.
-
They have no shame in separating youth
-
from their families
and from their communities,
-
and then they end up pinning
injustices and crimes on us.
-
The BPD says that they don't
use facial surveillance now,
-
but they have access to it.
-
And they have a $414
million budget to buy it.
-
Right now, our communities need access
-
to funding for healthcare,
housing and so many other things
-
and the list goes on.
-
For years, residents of
Boston have been demanding
-
to fund things that truly matter to them
-
and we have to invest in things
-
that are not face surveillance,
which doesn't even work
-
and is irrelevant to
keeping our community safe.
-
We must invest in intentional
restorative justice practices,
-
healthcare, mental health and resources
-
to prevent violence and distress.
-
Facial surveillance would
end privacy as we know it
-
and completely throw off the balance
-
and power between people
and their government.
-
Listen to youth, listen
to our community,
-
and let's follow the lead of young people
-
who are addressing the deeper issue
-
of what it is to criminalize our people
-
and putting the power in
the hands of the people,
-
especially when it comes
to governing surveillance.
-
Thank you all so much.
-
Thank you very much
-
and it sounds like Christina's
not available, Sabrina?
-
Yeah, she's not available.
-
Okay, Ismleda.
-
I'm Ismleda and a member of
the Student Immigrant Movement.
-
Today I'm here to testify in support
-
of banning facial recognition
technology in Boston.
-
This will establish a
ban on government use
-
of face surveillance
in the city of Boston.
-
Boston must pass this
ordinance to join Springfield,
-
Somerville, Cambridge,
Brookline and Northampton
-
in protecting racial justice
and freedom of speech.
-
The federal study
published in December 2019,
-
found that face recognition algorithms
-
are much more likely to fail
-
when attempting to identify
the faces of people of color,
-
children, the elderly and women.
-
That means that the technology
only reliably works
-
on middle-aged white men.
-
A very small fraction
of Boston's residents.
-
As a citizen of Boston,
-
it is clear to me that
this type of technology
-
is not okay to use in our communities.
-
The fact that this technology
-
is used to control specifically
in communities of color,
-
makes my family and me feel unsafe,
-
as well as the other families
-
that live within these communities.
-
Black and brown kids
are put into databases
-
that are used against them
without redress or reason.
-
This interrupts our education
and limits our future.
-
Kids shouldn't be criminalized
or made to feel unsafe.
-
Black and brown kids should
be able to walk on the street
-
without the fear of knowing
what's going to happen.
-
And it shows how this surveillance
-
is used to target black
and brown communities.
-
The racial profiling in technology
-
is one of the many forms
of systematic racism.
-
Immigrant youth in this
city do not trust the police
-
because we know that they
are very close with I.C.E.
-
We know that they sent
student information to I.C.E.,
-
and this has led to the
deportation of young people
-
and pains our communities.
-
[indistinct]
-
BPD, cameras with surveillance
and criminalization of us.
-
It's not right.
-
Everyone on the council at some point
-
has talked about supporting young people
-
and young people being our future leaders.
-
We are leaders right now,
-
we are telling you to do
what everyone knows is right.
-
By preventing the use of facial
recognition surveillance,
-
we can do more to protect
immigrant families in Boston.
-
Stop selling youth and families out.
-
Let's give power to people
-
and take this first step in
addressing the problem we have
-
with law enforcement surveillance
-
and criminalization of people
by surveillance systems.
-
Thank you very much
-
Next up, the next group
of folks we have are,
-
Sherlandy Pardieu, Eli Harmon,
Natalie Diaz and Jose Gomez.
-
So is Sherlandy available?
-
I see that person in
the waiting room, sorry.
-
Well, we'll go ahead and go to Eli.
-
I think I see Eli right now, ready to go
-
and then we'll come back to Sherlandy,
-
Eli you have two minutes.
-
Oh, thank you so much.
-
Oh, who's speaking?
-
oh, sorry, this is Eli.
-
Okay, very well.
-
Thank you so much.
-
My name is Eli Harmon,
I live in Mission Hill.
-
I've been doing work with the
Student Immigrant Movement.
-
I'm testifying on behalf of them today,
-
in support of the ordinance.
-
The issue of police surveillance
-
is enormously important to people
-
all across the city of Boston
-
though it's of particular
concern to communities of color,
-
which it overwhelmingly targets.
-
A study done by an MIT researcher
-
showed this type of
surveillance technology
-
would be far less accurate
for people of color.
-
And in fact, it was inaccurate
for 35% of black women
-
and was most accurate for white adult men.
-
Law enforcement today, already of course,
-
overwhelmingly targets
black and brown communities.
-
And this type of technology
-
is likely to only make
this more severe.
-
Additionally this type of technology
-
is a complete violation
of people's privacy.
-
There are no regulations with regard
-
to the extent to which law enforcement
-
can use this technology.
-
And it has been used as an abuse of power
-
by government officials in
order to track down immigrants,
-
people of color and activists.
-
Finally, lots of money goes into funding
-
this surveillance technology,
-
and for the reasons previously stated
-
this technology in no way
keeps our communities safe
-
and the money that currently funds it,
-
could be put into much better places,
-
such as creating
affordable housing for all,
-
fully funding the Boston public schools,
-
creating equitable healthcare
-
and other places which
would actually be effective
-
in bringing down crime rates.
-
Many cities that neighbor
Boston, including Somerville,
-
Brookline, Cambridge and Springfield,
-
have chosen to ban facial surveillance
-
because they understand
that it is dangerous
-
and harmful to communities of color.
-
It is a violation of people's privacy
-
and that money currently being
used to fund that technology
-
would be better spent in places
-
that actually improve lives in Boston.
-
For these reasons,
-
I ask the council to pass a prohibition
-
of the use of facial
recognition surveillance
-
and give the community control
-
over surveillance technology and usage.
-
Thank you so much.
-
Thank you very much.
-
I do see Sherlandy
-
I'm here.
-
Okay, you have two minutes Sherlandy.
-
I am writing on behalf of SIM
in support of the ordinance
-
banning facial recognition
technology in Boston.
-
I urge my city of Boston
to pass this ordinance
-
to join other cities in Massachusetts,
-
like Springfield, Somerville,
Cambridge, Brookline
-
and Northampton, and putting
community power
-
before the police.
-
The city of Boston,
the residents, families
-
and youth from Boston,
deserve transparency,
-
respect and control.
-
A ban on facial surveillance technology
-
is important in the city of Boston
-
because this technology is dangerous
-
and it leads to serious
mistrust in the community.
-
Face surveillance is
proven to be less accurate
-
for people of darker skin.
-
Black people already face extreme levels
-
of police violence and harassment.
-
The racial bias and face surveillance
-
will put lives in danger.
-
But even if accurate,
-
we should still be doing all that we can
-
to avoid a future where every
time we go to the doctor,
-
a place of worship or protest,
-
the government is scanning our face
-
and recording our movements.
-
We already know that law enforcement
-
is constantly sharing
and misusing information.
-
What would happen to our people
-
when we are already under
constant surveillance
-
and criminalization?
-
Using face surveillance,
-
the government could begin to identify
-
and keep a list of every
person at a protest.
-
People who are driving
-
or who are walking the
street every single day
-
across the city on an
ongoing, automatic basis,
-
it would keep the same
harmful surveillance going,
-
like what happened with the
use of [indistinct] database.
-
It would criminalize anyone
who makes a move in our city.
-
If we want this city to be safe,
-
we need to put our
money where our mouth is
-
and invest in practices
such as restorative justice
-
to help restore our community.
-
Facial surveillance is not the answer.
-
If anything, it is distracting
-
us from addressing our real concerns
-
about safety in our communities.
-
I urge you to prohibit the
use of facial surveillance
-
in the city of Boston,
-
because we cannot allow law
enforcement to target black
-
and brown immigrants, activists
and community members.
-
Thank you so much for your time.
-
Thank you very much.
-
I want to say thank
you to all of the youth
-
who have testified and
will testify at SIM,
-
all those folks showing up, you are...
-
I just wanna say thank you so much
-
on behalf of the city
council, you're our future
-
and you're doing an amazing job.
-
I have a Natalie Diaz and then Jose Gomez.
-
I see that Emily Reif
who had signed up before
-
might also be available.
-
So those next three names,
-
I think Natalie may no longer be with us.
-
Okay, is Jose available?
-
Yes.
CHAIR: Two minutes.
-
Hi everyone.
-
I'm here on behalf of
SIM to testify in favor
-
of the ban on facial
recognition software in Boston.
-
I am an MIT alumnus,
-
working as a software engineer in Boston
-
and I'm a DACA recipient as well.
-
And I wanted to tell you
-
that as a software engineer working in AI
-
and machine learning enabled robots,
-
that the software used
for facial recognition
-
is far from perfect.
-
In fact, as Joy found
and others have noted,
-
one in three black women is likely
-
to be misclassified by this technology.
-
And our own federal government has found
-
that the face recognition algorithms
-
were more likely to fail
-
when attempting to identify
faces of people of color,
-
children, elderly and women.
-
This software after all
is written by people.
-
And it is inherently filled
with mistakes and bias,
-
despite the detailed
quality assurance processes
-
our production-level software undergoes.
-
This is an accepted fact
in the software industry:
-
no matter how good your software is
-
or the testing process is,
-
bugs in the software will always exist.
-
Consequently, the use of
facial recognition technology
-
by government entities
especially police departments,
-
must be banned, because the consequences of errors
-
in the technology are life and death.
-
The false identification by
a facial recognition software
-
could lead to an unwarranted confrontation
-
with the police department.
-
A seemingly small error in
facial recognition software
-
may not mean much to you,
-
but as an undocumented
immigrant and a person of color,
-
any confrontation with the police
-
could mean deportation or even my life.
-
The police have a record
of systemic racial bias
-
leading to the deaths
of thousands of people.
-
Government entities, especially police forces,
-
do not need any more tools
to further their racial bias
-
and, consequently, the
systematic murdering
-
of the very people
they're meant to protect.
-
I encourage you to press pause
-
on the use of face surveillance
by government entities
-
in the city of Boston, by
supporting this crucial ban.
-
Thank you very much, after
Jose I think we had...
-
After Jose we had Emily,
Emily Reif has rejoined us.
-
She was part of the original
first set we called, Emily.
-
EMILY: Hello, sorry.
-
Can you hear me?
Yep, two minutes.
-
Sorry about that, I was
having internet issues.
-
So my name is Emily Reif
-
and I work in machine
learning research at Google.
-
Although of course my views are my own
-
and I'm here as a citizen.
-
Yeah, I strongly support the ban
-
on facial recognition technology
-
and many people have said similar things
-
to what I'm about to say
and much better words.
-
I just wanted to reiterate on...
-
Even when working correctly,
-
there are so many major privacy issues
-
with these technologies and by definition,
-
they're designed to track
us wherever and whenever
-
we're in public spaces and
to aggregate this information
-
and having those kinds
of extreme surveillance
-
is not only kind of horrifying
at a personal level,
-
but it will also fundamentally change
-
the way that we go about our daily lives
-
and operate as a democratic society.
-
And others have already cited
some of the research on this.
-
It's a well-documented
phenomenon, not a good one.
-
And so that's all about
facial recognition technology
-
when it's working perfectly.
-
But again, as people have noted
-
that's so far from the truth.
-
Yeah, there's a ton of
research on this by the ACLU,
-
and Joy who spoke earlier
-
and I've seen this in my own research.
-
Our team's research is
related to these areas.
-
That these models are just
totally not reliable for data
-
that is different than
what they're trained on.
-
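The failure mode Emily describes, a model degrading on data unlike its training set, can be illustrated with a minimal synthetic sketch. The numbers and the one-feature threshold model below are hypothetical, nothing like a production face system; they only show the in-distribution versus out-of-distribution accuracy gap she is pointing at.

# Minimal sketch (synthetic data): accuracy drops under distribution shift.
import numpy as np

rng = np.random.default_rng(1)

def sample(n, shift=0.0):
    # Two classes separated along one feature; `shift` moves the population.
    y = rng.integers(0, 2, n)
    X = rng.normal(loc=y + shift, scale=0.5, size=n)
    return X, y

# "Train": estimate a decision threshold on the source population.
Xtr, ytr = sample(5000)
threshold = (Xtr[ytr == 0].mean() + Xtr[ytr == 1].mean()) / 2

def accuracy(X, y):
    return ((X > threshold).astype(int) == y).mean()

Xs, ys = sample(5000)             # same population as training
Xt, yt = sample(5000, shift=0.7)  # a population the model never saw

print(f"accuracy, in-distribution:  {accuracy(Xs, ys):.0%}")
print(f"accuracy, shifted test set: {accuracy(Xt, yt):.0%}")

With these made-up numbers, the model scores in the mid-80s on data like its training set and falls to roughly two-thirds on the shifted population, the same qualitative gap reported for faces these systems were not trained on.
-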
And these inaccuracies
don't make it any less dangerous,
-
I mean, one might say
like, oh, so it makes it,
-
potentially less dangerous,
but it makes it much,
-
much more dangerous when the inaccuracies
-
have just reinforced the biases,
-
the disparities that
have always been there
-
in the way that different
groups are surveilled
-
and policed and incarcerated.
-
And we saw with
Breonna Taylor's death
-
that thinking that you're attacking
-
or that you're entering the right house,
-
and if you are wrong about that,
-
there are just horrible,
unspeakable consequences.
-
And yeah, we can't possibly risk that
-
and that's not a road we should remotely
-
think about going down, thank you.
-
That's perfectly on time.
-
I'm just gonna go down the list
-
and also call on Christian
De Leon, Clara Ruiz,
-
Diana Serrano and Oliver De Leon.
-
Are those folks available?
-
I see Christian.
-
Christian you may start,
you have two minutes.
-
[indistinct]
-
You have one
computer on and your phone
-
or something, you're
gonna have to turn off
-
one of the devices.
-
CHRISTIAN: Can you hear me fine now?
-
Much better, thank you.
-
CHRISTIAN: My name is Christian De Leon,
-
I'm here today as a member
-
of the Student Immigrant Movement
-
in support of banning facial
recognition in Boston.
-
Face surveillance is not only
a huge invasion of privacy,
-
but in the wrong hands can be used
-
to even further oppress minorities.
-
It can be used to track
immigrants, protestors,
-
and it is inaccurate when tracking
people of darker complexion.
-
As we know, this can be very problematic
-
since these tools target these communities
-
through racial profiling already.
-
Having surveillance tools
like facial recognition
-
will contribute to an even
greater systemic oppression.
-
It also has no business
being in our schools.
-
To my knowledge, some
schools are already investing
-
in face recognition software
-
and are investing millions
in this new technology,
-
and for what exactly?
-
To track their students' every move?
-
I believe schools should
spend that time and money
-
on bettering their school
and education systems,
-
rather than spending it
on software and tools
-
that will in no way improve
students' ability to be educated.
-
We are already living in a time
-
where technology can do scary things
-
and the last thing we need
-
are people watching our every
move on a day-to-day basis.
-
During this pandemic,
-
we have grown accustomed to wearing masks
-
and covering our faces in public,
-
Well, if we now have facial recognition
-
on every street corner,
-
then this temporary thing
the world has adapted to
-
may just become permanent.
-
Not only to protect our own privacy,
-
but to not be falsely accused of crimes
-
due to faulty recognition software.
-
2020 has put on great display
the racism and brutality
-
against people of color,
-
particularly the black community,
-
and I fear that systems such
as facial recognition software
-
in the hands of law enforcement,
-
can just be another tool
they use against us.
-
Thank you for listening to my concerns.
-
Thank you very much.
-
Clara Ruiz.
-
CLARA: Hello.
Hello.
-
You have two minutes Clara.
-
CLARA: Thank you so much.
-
My name is Clara Ruiz and
I'm writing in support
-
of the ordinance banning
facial recognition technology
-
in Boston presented by
councilors Wu and Arroyo.
-
This ordinance establishes
a ban on government use
-
of face surveillance
in the city of Boston.
-
Boston must pass this
ordinance to join Springfield,
-
Somerville, Cambridge,
Brookline, Northampton
-
in protecting racial justice,
privacy and freedom of speech.
-
This matters to me because
I'm a member of the BIPOC,
-
Black Indigenous and
People of Color Community.
-
And this is allowing racial biases
-
against the community to increase.
-
This technology does not
protect constitutional rights
-
and values, including
privacy in an immediate
-
[indistinct]
-
appropriate speech and
association, equal protection
-
and government accountability
and transparency.
-
And the government must be
aware of the technical limits
-
of even the most
promising new technologies
-
and the risks of relying
on computer systems
-
that can be prone to error.
-
Facial recognition is creeping
-
into more and more law
enforcement agencies,
-
with little notice or oversight.
-
And there are practically
no laws regulating
-
this invasive surveillance technology.
-
Meanwhile, ongoing technical limitations
-
to the accuracy of facial recognition
-
create serious problems
like misidentification
-
and public safety risk.
-
I refuse to sit back and
put my community at more risk
-
than it already is.
-
These surveillance technologies
-
are intentionally set up to
the disadvantage of brown
-
and black people and communities.
-
It has a better success rate
when it comes to white males,
-
meaning this technology is unreliable
-
and it will do more harm,
-
if we don't ban the surveillance tool.
-
This technology will allow
for misidentification
-
to purposely happen against
black and brown people.
-
A ban on face surveillance technology
-
is critically important
in the city of Boston
-
because we need to keep
the community safe,
-
because all life matters.
-
A federal government study
-
published in December 2019,
-
found face recognition algorithms
-
were more likely to fail
-
when attempting to identify
the faces of people of color,
-
children, the elderly and women.
-
That means the technology only reliably works
-
on middle-aged white men.
-
A really small fraction
of Boston residents.
-
I encourage you to press pause
-
on the use of face surveillance
by government entities
-
in the city of Boston,
-
by supporting and
passing this crucial ban.
-
We cannot allow Boston
to adopt authoritarian,
-
unregulated, biased surveillance technology.
-
Thank you for the opportunity to testify.
-
And by the way, Diana Serrano
-
is also gonna be testifying through this.
-
Diana you have two minutes.
-
DIANA: Okay, hello, my
name is Diana Serrano,
-
good evening.
-
I'm writing on behalf of SIM
in support of the ordinance
-
banning facial recognition
technology in Boston.
-
I urge my city of Boston
to pass this ordinance
-
to join Springfield,
Somerville, Cambridge, Brookline
-
and Northampton in
protecting racial justice,
-
privacy and freedom of speech.
-
It is our duty as a free and just city,
-
to ban technology like facial recognition.
-
As a victim of social media hacking,
-
I've seen how technology could
be used as a tool to defame
-
and criminalize with pictures
-
that feel like proof to the viewers.
-
As a teacher and civilian,
-
I refuse to allow those in my community
-
to be exposed to a highly problematic
-
technology like facial recognition.
-
Not only is privacy important
-
and should be valued
by our law enforcement,
-
face surveillance is
proven to be less accurate
-
for people with darker skin.
-
With the high levels of
racial discrimination
-
forced on our black and brown communities
-
by law enforcement,
-
the racial bias in face surveillance
-
will put lives in danger even
further than they already are.
-
A ban on face surveillance technology
-
is critically important
in the city of Boston
-
because that technology
really is only reliable
-
for a very small fraction
of the population.
-
It's highly important
to me to live in a city
-
where it is critical for
leaders to keep our city
-
and its people safe.
-
Although it may seem as
though facial recognition
-
could maybe keep us safer, it
has proven to do the opposite.
-
I urge you to press pause on
the use of facial surveillance
-
by government entities
in the city of Boston
-
by supporting and passing the crucial ban.
-
We cannot allow Boston to adopt
-
authoritarian, unregulated,
biased surveillance technology.
-
Thank you for your
attention, Diana Serrano.
-
Thank you very much.
-
Oliver De Leon, you have two minutes.
-
OLIVER: Hi, my name is Oliver
-
and I live in Jamaica Plain.
-
And I was undocumented when I attended
-
the Boston public schools
in the early nineties.
-
So yes, it was a while back;
I became a citizen in 2016.
-
I look back on my days in high school
-
while contemplating this new technology,
-
facial recognition, and I
can sincerely tell you that
-
it terrified me, the
thought of having somebody
-
access this technology without
my consent or knowledge.
-
I don't know if this would have allowed me
-
to finish school at that time.
-
And it would have been hard
-
to even finish possibly higher education.
-
Without that education,
-
I wouldn't have been able
to secure job opportunities
-
like I have now with a paying job
-
that helps me provide for my family.
-
So we started thinking about that.
-
We start seeing how
this facial recognition
-
can actually have a negative impact
-
on the education of our
youth and our future.
-
People already talk about how the U.S.
-
is riddled with racial bias,
-
and this software is just
giving people already empowered
-
a tool to discriminate even more.
-
For example, I.C.E has already
shown up in the protests
-
and detained a few people.
-
Now we have to ask ourselves
-
how the heck do they
pick out these few people
-
out of the thousands of protesters?
-
I mean, one can only speculate,
-
but one of the few answers
-
can be facial recognition software.
-
Do we not have it?
-
Do they have it?
-
How can we show that they're not using it?
-
Well, it's pretty hard to pick
out two or three individuals
-
out of thousands and thousands
-
and thousands of protestors.
-
So it's up to us to kind
of speculate what that is.
-
The coronavirus has caused a major epidemic.
-
Police brutality has
caused a major epidemic.
-
I guess we have to ask ourselves,
-
are you allowing the
next epidemic to happen?
-
Maybe.
-
Banning facial recognition
in the city of Boston
-
will put us in the
middle of the next battle.
-
And which side do we wanna be on?
-
Let's not approve this software.
-
I urge you to ban facial
recognition technology
-
as it also affects our youth.
-
We do not need a system or software
-
that can pick up individuals
-
and potentially erroneously
-
accuse them of something
that they haven't done.
-
Will the next death be
caused by your choice today?
-
Let's hope not, thank you.
-
Thank you, thank you very much.
-
It's 10 to six.
-
I'm going to have to stop
sharing at six o'clock.
-
That only means that I will
be turning over the gavel
-
to one of the lead sponsors,
-
Michelle Wu at six o'clock.
-
But I'm gonna call the
next names of individuals
-
who are in line and have
signed up to speak before I go.
-
And we'll continue chair.
-
I've got an Angel, Michelle
Raj Mon, Leon Smith,
-
Amy van der Hiel, excuse me,
Julie McNulty and Zachary Lawn.
-
So is Angel available?
-
ANGEL: Yes.
-
Hello everyone, thank
you for being here today.
-
Thank you.
-
ANGEL: My name is Angel
and I am in first grade
-
and go to a Boston public school.
-
I am seven years old.
-
To me surveillance means
people watching you
-
without you even knowing.
-
I would feel frustrated
-
because I do not want anyone watching me,
-
especially if I'm home
at school or at a park.
-
Facial recognition is
when they see your face
-
and know all the details,
-
but this does not work for everyone
-
because most police do not like people
-
who are not their skin color.
-
This tool is not going to be equal
-
to people who are black,
Latino, immigrants and more,
-
it can get the wrong person.
-
I am young, as I get older,
I will not look the same.
-
I am seven now and when I turned
13, I will look different.
-
When I turn 18, I will look different.
-
I get older and my face changes.
-
If these tools are used in
school, I will not feel safe.
-
We are just children.
-
Our teachers are already
taking care and watching us.
-
We do not need police watching us.
-
Thank you SIM, for your love and support.
-
Well, I have to say,
-
I think you're the youngest
person that's testified ever.
-
And in any hearing I've ever chaired,
-
or that I've participated in,
-
I know we're all on Zoom,
-
but I will give you a round of applause.
-
Thank you so much.
-
That was absolutely beautiful,
very powerful testimony.
-
We're so proud of you.
-
Your city council and
city are so proud of you.
-
Thank you so much for
your testimony Angel.
-
I'm going to now call on,
-
I don't think I see Michelle or Leon
-
and I don't hear them either.
-
Okay, so I see Amy, who
we called earlier.
-
So Amy, you have two minutes
-
and you get to follow the
cutest seven-year-old ever.
-
[Lydia laughs]
-
Go ahead,
-
you're on mute.
-
Thank you, I applaud
her testimony entirely.
-
Thank you, madam chairwoman, and the council
-
for hearing our testimony,
I'll be very brief.
-
I just wanted to say my
name is Amy van der Hiel.
-
I live in Roslindale
-
and I support the ban on
face surveillance, thank you.
-
Thank you very much, Michelle.
-
I have a Michelle Raj Mon and
I see that Michelle Barrios.
-
Are you...
-
Okay, that was my mistake.
-
Very well then, Michelle
you have two minutes.
-
That's okay.
-
And also, me following
Angel is a bit much.
-
So hello.
-
My name is Michelle Barrios
-
and I am a social studies
teacher in the Boston area.
-
And today I am here speaking on behalf
-
of the Student Immigrant Movement,
-
myself as an educator
-
and on the behalf of my
diverse student population.
-
I am speaking in support of the ordinance
-
banning facial recognition
technology in Boston,
-
presented by councilors Wu and Arroyo.
-
This ordinance establishes
a ban on government use
-
of facial surveillance
in the city of Boston.
-
Boston must pass this
ordinance to join Springfield,
-
Somerville, excuse me,
Springfield, Somerville,
-
Cambridge, Brookline and Northampton
-
in protecting racial justice,
privacy and freedom of speech.
-
In my training as a
social studies teacher,
-
I became increasingly aware of the trauma
-
that students of color develop
-
as they grow in a society
-
that seeks to police them on
the basis of their ethnicity,
-
economic status and place of residence.
-
This trauma is exacerbated
when a young person
-
comes from an immigrant family,
-
whose status in this country
-
places them at risk of family separation,
-
a loss of education and a
stripping of basic human rights.
-
We know that students struggle
to succeed academically
-
when they face these
daily existential traumas,
-
which in essence is a
violation of their rights
-
in the U.S. to pursue survival,
-
let alone social mobility
and a future of stability.
-
In addition to my training
and work as an educator,
-
I am the wife of a Latino immigrant
-
and the mother of two
of my sister's children,
-
While I am a white Latina
-
and I can live under the
presumption of innocence
-
due to my white privilege,
-
I've witnessed firsthand what
it means to live in a society
-
that has not yet made good
on its promise of equality,
-
liberty, and justice for all.
-
My husband is followed in stores,
-
teachers hesitate to
release our children to him
-
when they first meet him
-
and he has been asked for
his immigration status
-
by customers in his place of work.
-
We are in the fortunate
position to know that his life
-
and our family can count
on his permanent residency
-
to keep us together for now.
-
However, I cannot say the
same for many Latinx families
-
in the greater Boston area.
-
A ban on face surveillance technology
-
is critically important
in the city of Boston
-
because this surveillance
harms our privacy
-
and our freedom of speech,
-
a fundamental right in our constitution.
-
This type of surveillance
threatens to create a world
-
where people are watched and identified
-
as they exercise their right to protest,
-
congregate at places of worship
-
and visit medical providers,
-
as they go about their daily lives.
-
For my students of color,
-
specifically the young
men growing up in a world
-
that still cannot fully accept
that their lives matter,
-
face surveillance technology is not only
-
another threat to their lives,
-
but a response saying
to these young people,
-
you still don't matter.
-
I teach world history as well
as comparative government
-
and politics, and when I
teach about authoritarianism,
-
I teach about the impact
on marginalized groups
-
in countries that are still
considered developing.
-
I teach that in these free countries
-
or in these recently free countries,
-
surveillance is one of
the most common tools
-
to manage dissent.
-
And that is often followed
by state sanctioned violence
-
as a means to silence any dissidents.
-
I urge each of you to
consider the statement
-
that you
would make by not banning
-
the use of facial surveillance
technology in Boston.
-
I urge each of you to imagine
a 14 year old student of color
-
or a Latino essential
worker in front of you,
-
asking if their lives matter.
-
What is your response?
-
I encourage you to consider that moment
-
and to press pause on the
use of face surveillance
-
by government entities
in the city of Boston,
-
by supporting and
passing this crucial ban.
-
We cannot allow Boston
to adopt authoritarian,
-
unregulated, bias surveillance technology.
-
Thank you for your attention
and your consideration.
-
Thank you very much.
-
At this point, I'm going
to turn over the microphone
-
or the gavel if you will, to
my colleague, Michelle Wu.
-
I'll just say, up next, the
speakers who should be getting ready
-
are Julie McNulty, Zachary
Lawn, Christina Rivera,
-
and Joseph Fridman.
-
I wanna thank you so
much to the lead sponsors
-
of this ordinance.
-
I will be planning to
have a working session
-
as soon as possible.
-
I do understand the
urgency and I do believe
-
that there is a great
opportunity before us to lead
-
in a way that not only values the lives
-
of our black and brown residents, but in
general, our civil liberties.
-
And so I want to thank again,
-
lead sponsors for their leadership.
-
I look forward to getting this done
-
and with that I'm going
to have to sign off
-
and I turn over to Michelle Wu.
-
Thank you.
-
Thank you so much, madam chair.
-
We're very grateful for you.
-
Okay, next step was Julie.
-
And if not, Julie, then Zachary.
-
And just to note, could
council central staff
-
give me administrative rights,
-
so I can help search whether folks
-
are stuck in the waiting room
-
and transfer people over into the
panelists section, thank you.
-
So again, it was Julie,
Zachary, Christina Rivera,
-
Joseph Fridman, Danielle
Samter, for the next few.
-
Christina, why don't you go ahead.
-
I see you now in the main room.
-
Hi, yes.
-
Thank you for having me. As stated,
-
my name is Christina Rivera
-
and I am an east Boston resident.
-
I'm also the founder of
the Latinx Coalition
-
for Black Lives Matter.
-
It is a support organization
-
to help end racism within our community,
-
but also help advance the work
-
of the Black Lives Matter Movement.
-
I wanna say that we as an organization,
-
we fully support the ordinance
-
of a ban on facial recognition technology.
-
As stated all throughout
this meeting earlier,
-
we know that automation bias exists
-
and that facial recognition technology
-
clearly shows bias towards
already highly targeted
-
and hyper policed communities,
-
specifically those of color.
-
And I just wanna pose the question,
-
how can we begin to trust
-
even with the development
of algorithmic accuracy,
-
that those who will use it in the future
-
will not abuse its capabilities?
-
We already have a police force
-
that has shown time and time again,
-
that some of their own
procedures are still abused
-
to this date and used by bad actors.
-
So adding a tool that can be used
-
to further affirm racial bias
-
only provides another route
to target our communities.
-
Using this technology also
enforces a policing system
-
that is based on control,
-
and that is a direct
symptom of the mistrust
-
that police already have of their own citizens,
-
the very ones that they serve.
-
And so, as some people earlier stated,
-
specifically Mr. Daley's point about
-
what government agencies are
allowed to already use this
-
versus places that have enacted bans,
-
what is also going to
be the legislative power
-
by which any federal agency
will be able to use it
-
in individual states,
-
specifically when it comes
to I.C.E and immigration.
-
So this is something that
we are looking to make sure
-
that it not only is banned now,
-
but also cannot be allowed in the future.
-
With that said,
-
thank you so much to all the panelists
-
who've shared all this great information.
-
Thank you so much to councilor
Wu and councilor Arroyo
-
for doing this work, and I cede my time.
-
Thank you.
-
Thank you so much, Christina.
-
We appreciate you, Joseph.
-
Yes, thank you for your time.
-
I'd like to tell you a
little bit about the science
-
behind emotion detection technologies.
-
So these are technologies
-
that use information captured
from face surveillance
-
to supposedly recognize emotions.
-
So I'll be speaking to
you as a private citizen,
-
but I'll tell you that
for the past three years,
-
I've been working at a lab
at Northeastern University,
-
which studies emotions,
-
and I've been a research assistant
-
at the Harvard Kennedy School's
Belfer Center for Science
-
and International Affairs,
-
thinking about the same
type of technology.
-
At the Interdisciplinary
Affective Science Laboratory
-
at Northeastern,
-
one of my bosses is Dr.
Lisa Feldman Barrett.
-
Dr. Barrett has research
appointments at MGH
-
and Harvard Medical School.
-
She's the chief scientific
officer at the MGH Center
-
for Law, Brain and Behavior.
-
She's the author of
-
over 230 peer reviewed scientific papers,
-
and just finished a term as president
-
of the Association for
Psychological Science,
-
where she represented tens
of thousands of scientists
-
around the world.
-
Three years ago,
-
this organization commissioned
her and four other experts,
-
psychologists, computer
scientists, and so on,
-
to do a major study
-
on whether people across
the world express emotions
-
like anger, sadness,
fear, disgust, surprise,
-
and happiness with distinctive
movements of the face.
-
So all of these scientists
-
started with different perspectives,
-
but they reviewed a
thousand research articles
-
and they came to a very simple
and very important consensus.
-
So the question is,
-
can you read emotions
reliably in a human face?
-
And the answer is no.
-
Here's an example from Dr. Barrett.
-
People living in Western
cultures scowl when they're angry
-
about 30% of the time,
-
this means that they make
other facial expressions,
-
frowns, wide-eyed gasps,
smiles, about 70% of the time.
-
This is called low reliability.
-
People scowl when angry more than chance,
-
but they also scowl
when they're not angry.
-
If they're concentrating or if
they have something like gas,
-
this is called low specificity.
-
So a scowl is not the expression of anger,
-
but it's one of many expressions of anger.
-
And sometimes it doesn't
express anger at all.
-
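The 30% figure can be made concrete with a small worked example. The counts below are hypothetical, chosen only to match the rates quoted in the testimony; they are not from Dr. Barrett's paper.

# Hypothetical counts illustrating "low reliability" and "low specificity".
angry_total = 1000       # observed moments of genuine anger
angry_scowl = 300        # scowls during anger: ~30% (low reliability)
not_angry_total = 2000   # non-angry moments (concentrating, gas, etc.)
not_angry_scowl = 200    # scowls with no anger at all (low specificity)

p_scowl_given_angry = angry_scowl / angry_total                      # 0.30
p_angry_given_scowl = angry_scowl / (angry_scowl + not_angry_scowl)  # 0.60

print(f"P(scowl | angry) = {p_scowl_given_angry:.2f}")
print(f"P(angry | scowl) = {p_angry_given_scowl:.2f}")

# An emotion AI that maps scowl -> anger would miss 70% of angry moments
# and be wrong about 40% of the scowls it does flag, under these assumed
# counts and before any demographic skew is considered.
-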
In this paper, Dr.
Barrett and her colleagues
-
showed that scowling and
anger, smiling and happiness,
-
frowning and sadness,
-
these are all Western stereotypes
of emotional expressions.
-
They reflect some common beliefs
-
about emotional expressions,
-
but these beliefs don't correspond
-
to how people actually move
their faces in real life.
-
And they don't generalize to cultures
-
that are different from ours.
-
So it's not possible for anyone
-
or anything to read an
emotion on our face.
-
And we shouldn't confidently
infer happiness from a smile,
-
anger from a scowl or
sadness from a frown.
-
There are technologies
that claim to do this,
-
and they're misrepresenting
what they can do
-
according to the best available evidence.
-
People aren't moving their faces randomly,
-
but there's a lot of variation
-
and the meaning of a facial movement
-
depends on the person, on the
context and on the culture.
-
So let me give you one example
-
of how making this assumption
can be really harmful.
-
There's a scientist, Dr. Lauren Rhue, who,
-
when she was at Wake Forest University,
-
published work reviewing two popular
-
emotion detection algorithms.
-
She ran these algorithms
-
on portraits of white
and black NBA players.
-
And both of them consistently interpreted
-
the black players as having
more negative emotions
-
than the white players.
-
Imagine that there's a security guard
-
that gets information
from a surveillance camera
-
that there is a possible nuisance
-
in the lobby of their building
-
because of the facial movements
that somebody is making.
-
Imagine a defendant in the courtroom
-
who is scowling as they
concentrate on legal proceedings,
-
but an emotion AI, based
on facial surveillance,
-
tells the jury that
the defendant is angry.
-
Imagine that the defendant is black
-
and the AI says that they're contemptuous.
-
Imagine, at worst, a police officer
-
receiving an alert from a body camera
-
based on this technology
-
that says that someone in
their line of sight is a threat
-
based on a so-called
reading of their face.
-
Councilors, I think that the question
-
that we should be thinking about
-
is whether we want someone
to use this technology on us
-
with the chance that it
would misinterpret our face
-
given the major chance
that it has of being wrong.
-
And I think the clear
answer to that is no.
-
We should ban facial
surveillance technologies
-
and all of these other
dangerous applications.
-
Thank you.
-
Thank you very much.
-
Next up is Danielle and
Danielle will be followed
-
by DJ Hatfield and Denise Horn.
-
Hi, good afternoon everybody.
-
My name is Danielle,
I'm a Boston resident,
-
I live in the north end.
-
I just wanted to fully support
-
and put my voice behind the ban
-
on facial recognition technology.
-
I feel as though the expert witnesses
-
and activists that have gone before me,
-
have clearly and eloquently
expressed the views
-
that I also share.
-
I find this technology to
be incredibly dangerous.
-
As a cybersecurity professional,
-
I have a lot of questions about
-
what it means to be
collecting data on people
-
and storing it.
-
I certainly think that this is a measure
-
that should be taken to set a precedent
-
for all future developments
of the technology
-
as was just stated.
-
There are new and upcoming forms
-
of facial recognition,
through emotions or otherwise,
-
that we can't even comprehend
yet because they haven't started.
-
So I wanted to support it and
thank all the other people
-
for putting their voice behind it
-
and hope to see it take place, thank you.
-
Thank you very much, DJ.
-
Hi, I will start the video.
-
So I'm DJ Hatfield.
-
I'm a professor of
anthropology and history
-
at the Berklee College
of Music in Boston.
-
However, what I have to
say is as a private citizen
-
and a resident of East Boston,
-
that's a shout out to
Lydia, East Boston pride.
-
Yes, as an anthropologist I'm
aware that all technologies,
-
including digital technologies
-
reflect and augment the social structures
-
and systems of value of
particular societies.
-
They are not objective,
-
but reflect the biases of those societies
-
as they develop through history.
-
Moreover, if we look at societies
-
where facial recognition
has been widely deployed,
-
such as the People's Republic of China,
-
we see very clearly how these technologies
-
have been used systematically
-
to target marginal populations.
-
This is a quality of the way
-
that these technologies
intersect with social systems
-
that act against people of color
-
and indigenous people, globally.
-
Therefore, in the city of Boston,
-
where we take pride in our civic freedoms,
-
we should be wary of technologies
-
that would make our Fourth
Amendment meaningless
-
and would erode the First Amendment.
-
Therefore, I would urge the city council
-
and the city of Boston
to pass a complete ban
-
on facial recognition
systems and technology,
-
with the amendments that have
been proposed by experts,
-
such as those of the
Electronic Frontier Foundation,
-
and the Algorithmic Justice League,
-
who have said there should be no exception
-
for the Boston Police Department
to use facial recognition
-
or surveillance systems in evidence,
-
and that there should be strict rules
-
for enforcement of breaches of this ban.
-
And finally, if possible,
extending this ban
-
to all private actors who have business
-
in the city of Boston.
-
Thank you for your consideration
-
and thank you for all your time today.
-
I appreciate being able to join
-
in my very first public hearing.
-
Thank you.
-
We are excited to have you at our hearing
-
and hope you'll come back, Denise next.
-
And then after Denise,
-
there are a few names signed up
-
that I didn't see in the room.
-
So I'm gonna call out folks
-
that I was able to cross-reference
between the waiting room
-
and the list, which were Charles Griswold
-
and Nora Paul-Schultz.
-
And then after that,
-
I'm gonna try to admit everyone else
-
who's in the room and go in order,
-
even if you weren't on the original list.
-
So Denise, please.
-
Thank you.
-
Thank you members of the council
-
and the lead sponsors of the ordinance
-
and the panelists, really
excellent commentary.
-
And I just have to say how impressed I am
-
with the Student Immigrant
Movement for showing up today,
-
this is really impressive.
-
My name is Denise Horn, I'm
a resident of Jamaica Plain.
-
I'm also a professor of political science
-
and international relations
at Simmons University.
-
Although I am not representing
my institution today.
-
And I want to say that I support the ban
-
against the use of facial surveillance.
-
So, like professor Hatfield,
-
I am a scholar of international relations,
-
I have studied authoritarian
regimes in Eastern Europe
-
and East and Southeast Asia,
-
where surveillance technologies
of every kind have been
-
and are used to target
enemies of the state.
-
In the United States we're
already seeing this language
-
in use by the leaders in our government,
-
regarding immigrants and activists,
-
and especially those who
are opposed to fascism.
-
I do not think that we
should underestimate
-
how this type of technology could be used
-
to further racist and
authoritarian agendas here.
-
Facial surveillance
is used to oppress
-
and control citizens under
the guise of public safety.
-
This is a slippery slope.
-
Government surveillance
is used to silence,
-
to suppress speech and
to violate human rights.
-
As an educator, I worry about
-
how this type of technology will be used
-
to adversely affect the lives
of my students of color,
-
trans students, undocumented
students, activists
-
and marginalized groups.
Their lives matter.
-
I strongly support this ban
-
and I thank you for letting me speak.
-
Thank you, Denise, Charles.
-
Yep, so I'm trying to start
the video here, there we are.
-
Can you see me and hear me?
-
Yes we can.
Okay, thank you.
-
My name is Charles Griswold,
I'm a resident of Brighton
-
and I'm a philosophy professor
at Boston University.
-
I hasten to add that I'm
offering my thoughts here
-
and in offering my thoughts,
-
I am not speaking for my university
-
and not acting on its behalf.
-
So I'm speaking on my own behalf only.
-
I'm very grateful to all
of you on the council
-
for taking action on this matter
-
and to the police commissioner
for his impressive
-
and in some ways reassuring testimony.
-
It's imperative, I think,
-
that you approve this ban on
facial recognition technology,
-
or rather technologies, as we've
learned today, by government,
-
before the relevant software is upgraded.
-
So far as I can tell,
-
three arguments for the ban
have been discussed today.
-
The first two extensively
and the third a bit less so
-
and taken together, they
seem to me to be decisive
-
in making the case for the ban.
-
The first is of course
-
the matter of racial and gender justice,
-
on account of the fact that the technology
-
is known to be inaccurate.
-
But second is the
fact that the technology
-
is objectionable
-
not just because it's not accurate,
-
but also when it is accurate,
-
as Kade Crockford has
eloquently argued today.
-
And that seems to me
to be an essential part
-
of the response to the
commissioner's remarks.
-
In sum, the technology puts
tremendous additional power
-
in the hands of government and its agents,
-
all the more so when it actually does work
-
and our privacy among other things
-
is certainly threatened by that.
-
Where could that lead? Look no further,
-
as people have suggested, than the nightmare
-
that is the surveillance
apparatus in China today.
-
The third point has been mentioned,
-
but it was less emphasized.
-
And I think it's just worth underlining,
-
which is that the awareness
-
of being personally tracked
outside of one's home
-
by this technology can,
-
and I think will have a chilling effect
-
on our exercise of our civil rights,
-
including our rights of free speech,
-
freedom of association, and of religion.
-
So I strongly urge the
council to pass this ban,
-
but I also ask that you
will not stop there,
-
expand the ban even further,
-
going forward, to include other kinds
-
of remote surveillance systems
-
that are biometric, for example,
-
gait and voice recognition systems,
-
and also extend it to the private
sector, as has been suggested.
-
So I hope that you will consider
expeditiously passing this,
-
but also going further.
-
Thank you so much for listening.
-
Thank you very much, Nora.
-
Good evening, my name
is Nora Paul-Schultz,
-
and I am a physics teacher
in Boston public schools.
-
And I've had the pleasure
of teaching Eli and Esmalda
-
who spoke earlier and
hopefully one day Angel.
-
And I'm a resident of Jamaica Plain.
-
As someone who studied
engineering when I was in college,
-
has taught engineering
-
and is one of the robotics
coaches at The O'Bryant,
-
I understand the impulse to feel like
-
technology can solve all of our problems.
-
I know that for
many there is a desire
-
to believe that if we have better,
smarter, and more technology,
-
then our communities will be safer,
-
but the reality is that
technology is not
-
a perfect, unbiased tool,
-
as much as we wish it would be.
-
Our technologies reflect the
inherent biases of their makers.
-
So technology isn't going to save us.
-
It does not take
much to see that racism
-
is infused in all parts of our country.
-
And that is true about the
facial surveillance technologies.
-
Through my work with Unafraid Educators,
-
the Boston Teachers Union's
-
immigrant rights organizing committee,
-
on the information sharing
between Boston School Police
-
and the Boston Police Department,
-
I know how detrimental observations
and surveillance can be.
-
Practices of surveillance
-
like camera recording in the hallway
-
or officers watching young
people are not passive.
-
Surveillance is active.
-
It leads to the creation
of law enforcement profiles
-
about young people and
that can impact their lives
-
in material ways for years to come.
-
And one of those impacts,
being in the system,
-
can make it harder for
young people to get a job
-
or get housing.
-
It can lead to incarceration,
it can lead to deportation.
-
Our city government
does not need more tools
-
to profile the people of our city
-
and especially doesn't need tools
-
that are inherently racist.
-
This is why the ban is so important.
-
Teaching has taught me
that what keeps us safe
-
is not having the government surveil
-
and track us through technology;
-
what keeps us safe is
investing in housing,
-
education and healthcare.
-
This face surveillance
technology harms our community,
-
Ah!
-
Sorry.
-
This surveillance technology
harms our community.
-
The fact that the face surveillance technology
-
only identifies black women
correctly one third of the time
-
lays bare both the ineffectiveness
-
and implicit bias built into the system.
-
As a teacher I know that
that is not a passing grade.
-
We need money to go to services
-
that will protect our community,
-
and we need to stop and prevent
-
the use of ineffective, racist,
-
and money-wasting technologies.
-
Thank you.
-
Thank you, Nora and we
appreciate all of your work.
-
I'm gonna quickly read the list of names
-
that were signed up to testify,
-
but I didn't see matches
of the names of folks
-
left in the waiting room.
-
So just in case you're signed
in under a different name,
-
but I'm expecting that these folks
-
probably aren't here anymore.
-
Julie McNulty, Zachary
Lawn, McKenna Kavanaugh,
-
Christine Dordy, Sarah Nelson,
Andrew Tario, Amanda Meehan.
-
Chris Farone, Bridget
Shepherd, Lena Papagiannis.
-
Anyone here and signed on
under a different name?
-
Nope, okay.
-
Then I will just go on the order
-
of folks appearing on my
screen to close this out.
-
Thank you so much for your patience.
-
So that will be Mark G then Adeline Ansell
-
then Carolina Pena.
-
Yes, so this is Mark Gurvich,
-
I'm presenting testimony on behalf of
-
Jewish Voice for Peace,
-
a local and national organization
-
that's guided by Jewish
values, fighting racism,
-
sexism, Islamophobia, LGBTQ oppression
-
and anti-immigrant policies here
-
and challenging racism, colonialism
-
and oppression everywhere,
-
with a focus on the injustices
suffered by Palestinians
-
under Israeli occupation and control.
-
In recent days in the U.S.
we have again witnessed
-
the impact of militarized policing
-
on communities of color.
-
Councilor Wu has pointed
out in a recent publication
-
about this militarization,
-
and much of this
militarization has been through
-
the transfer of billions of
dollars worth of equipment
-
from the military to
civilian police forces,
-
and the training of U.S. police
-
in the use of military tactics
-
in controlling the communities
-
most heavily impacted by racism,
-
economic depletion, and health crises.
-
Our view is that the use of
facial recognition technology
-
is a key component of the
militarization of policing.
-
We know this because of our knowledge
-
and experience of the Israeli occupation
-
of the Palestinian people and land.
-
Facial recognition is a core
feature of that occupation.
-
Much of the technology being marketed
-
to U.S. law enforcement,
-
has been field tested on Palestinians
-
under military occupation.
-
At this historic and critical time,
-
when the effects of militarized
policing in the U.S.
-
have resulted in nationwide
and worldwide condemnation,
-
the very last thing we
need is another tool
-
adopted from the toolbox
of military occupation.
-
Jewish Voice for Peace supports
the face surveillance ban.
-
Thank you.
-
Thank you very much.
-
Adeline or Adeline, sorry
to mispronounce your name.
-
Adeline Ansell.
-
We'll move on to Carolina
or Carolina Pena.
-
Let me see.
-
I will unmute on my own in case
-
it's a muting and unmuting issue.
-
Then we'll move on to Kristen Lyman.
-
Kristen will be followed by Galen Bunting,
-
Maria Brincker and Araya Zack.
-
Kristin.
-
Okay, going to Galen,
-
Maria Brincker.
-
MARIA: Yes.
-
Hi, I had actually not
necessarily prepared.
-
Hello, can you hear me?
-
Adeline, we'll go to you
right after Maria.
-
Sorry about the confusion.
-
ADELINE: Okay.
Okay, no problem.
-
So I just wanted to thank everybody
-
and I wanna say that I am very much,
-
I'm an associate professor at UMass Boston
-
and both as an educator,
-
and as a researcher in
issues of surveillance,
-
I am very much in favor of the ordinance.
-
I would like to say that
-
I would also like the council
to consider wider regulations.
-
So as many have said before,
-
there's a lot of issues
also in the private sphere.
-
So this is a very important first step,
-
but a lot of the use of facial recognition
-
in law enforcement,
-
really interacts with the use
of surveillance technologies
-
in the private space.
-
So for example, when the
Brookline Police Department
-
was trying to argue
against the Brookline ban,
-
they used an example of
police using Snapchat.
-
So a lot of speakers before
have talked about the importance
-
of anonymity in public space,
-
and I'm fully in support of that,
-
but this is not only about public space.
-
So the use of facial recognition software
-
is of course invading all
our private spaces as well.
-
So I would like people
to be aware of that.
-
And as a teacher right now,
-
with the pandemic, teaching on Zoom,
-
you can only imagine
the deployment of voice
-
and face recognition in all our spaces.
-
And so I think this is very important.
-
And another thing,
-
my background is also in
philosophy of neuroscience
-
and in the understanding
of autonomous action.
-
And autonomous action
depends on being in a space
-
that we can understand.
-
And one of the reasons
why facial recognition
-
is one of the most dangerous inventions
-
is that it precisely ruins
the integrity of any space
-
because we don't know who's there
-
and it can do so retroactively.
-
So I thank you very much
for all your work on this,
-
and I'm fully in support of
the ordinance, thank you.
-
Thank you so much.
-
Okay, next we'll go to Adeline
-
Hi, I'm Adeline Ansell.
-
I was just coming on to
support the ordinance as a member
-
of the Unafraid Educators group
-
of the Boston Teachers Union,
-
to protect our immigrant students
-
who have been
advocating for themselves
-
through the Student Immigrant Movement
-
to support this ordinance
-
banning facial recognition technology,
-
because it puts our
students more at risk.
-
And I don't have a lot to add.
-
I just wanna support
the work they're doing
-
in their student advocacy.
-
And I know they all spoke for themselves
-
and their allies did as well,
-
and just wanted to let you
know that Boston teachers
-
also support this ban.
-
So thank you for listening
to all of their testimony
-
and know that the teachers
are on their side too,
-
and that we support this
ordinance, thank you.
-
Thank you.
-
Okay, last call.
-
Either Carolina or Galen
-
or anyone else who would like to speak.
-
I think everyone's been admitted
from the waiting room now.
-
Okay, seeing no takers,
-
I'll pass it back if any councilors
-
wish to make a closing
statement before we wrap up.
-
I see, I'll save councilor Arroyo
for right before we close,
-
councilor Annissa Essaibi-George.
-
Thank you madam chair at this point
-
and thank you to you and councilor Arroyo
-
for bringing this before the council.
-
I've really appreciated
everyone's testimony today
-
in both hearings and sort of the,
-
certainly a very direct correlation
-
between some of the testimony today,
-
but a particular interest of mine
-
was the testimony of the young people
-
and the testimony of the teachers,
-
in particular...
-
So just thank you, thank you both.
-
Thank you councilor Wu for
staying after the chair
-
had to leave to extend an opportunity
-
for this continued testimony.
-
It was certainly important testimony,
-
but just the fact that everyone stayed on
-
as long as they did to
offer that testimony,
-
I think is really important to recognize,
-
and to be grateful for that engagement.
-
And thank you to you and
thank you to the maker.
-
Thank you councilor
-
and final closing words from
my co-sponsor councilor Arroyo.
-
Just a sincere thank you to our panelists.
-
I think many of them are off now,
-
but thank you so much to them
-
and everybody who gave testimony,
-
much of that was deeply moving.
-
Certainly Angel was a highlight.
-
So thank you Angel,
-
and anybody who can get
that message to Angel,
-
I'm not sure if it's bedtime.
-
Thank you so much for raising your voices
-
and really being focused
at a real municipal level.
-
This is what we've always asked for.
-
This is what I grew up with, about Angel's age,
-
watching these meetings
-
and having folks basically beg
-
for this kind of interaction
from our communities.
-
And so I'm just deeply grateful
-
that whatever brought
you here specifically,
-
I hope you stay engaged.
-
I hope you bring this
energy to other issues
-
that really affect our
communities on a daily basis
-
because it really makes change.
-
It really changes things.
-
And so thank you so much to
everybody who took part in this,
-
thank you to all the
advocates as a part of this.
-
Thank you councilor Wu for making sure
-
that there was a chair here
-
and to councilor Edwards for first
running it so well earlier.
-
But thank you to you for ensuring
-
that we can continue to
hear public testimony
-
because it really is valuable.
-
So with that, I thought
that was all very valuable.
-
And also thank you to the commissioner
-
for taking the time to be here
-
and to really endorse this idea.
-
I thought that was fantastic.
-
So thank you.
-
Thank you, this is a great, great hearing.
-
I think we all learned so much
-
and I wanna echo the
thanks to the commissioner,
-
all the community members who
gave feedback along the way
-
and over this evening, my colleagues,
-
and of course, central staff,
-
for making sure all of it
went off very, very smoothly.
-
So we'll circle back up
with the committee chair
-
and figure out next steps,
but hoping for swift passage.
-
And I'm very grateful again,
-
that we have an opportunity
in Boston to take action.
-
So this will conclude our
hearing on docket number 0683,
-
on an ordinance banning
facial recognition technology
-
in Boston.
-
This hearing is adjourned.
-
[gavel strikes]
-
Bye, see you.
Bye.